Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - Asmodean

1
Excellent work. Final Fantasy VII would appreciate some love, I'd imagine. I look forward to seeing these implemented.

2
Graphical / Re: [WIP] APZ's New Canon Cloud
« on: 2012-08-21 19:35:10 »
Respect for your determination in getting this out again, updated. Well done, mate.

3
leostrife,

For now, try this: ensure SSAO and Bloom are enabled via the #define section, and disable both FXAA and HQBR for now. I will have v0.2 out sometime next week, when I'm finished updating the new implementation. My new shader doesn't seem to be working on Nvidia, though. I'm also in the process of rebuilding my system; I should have a GTX 680 soon, hopefully.
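
For reference, the #define section at the top of the .post file looks roughly like this (a sketch only; the exact names in your copy may differ):

// comment a line out to disable that effect
#define SSAO
#define BLOOM
//#define FXAA
//#define HQBR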


Leonhart7413,

The window size variable is your monitor's maximum resolution. The internal size is limited by your GPU, not your display; it's the game's internally rendered resolution. This can be increased as far as your GPU will allow (you'll receive a pop-up dialog when starting the game if you go too high for your card), and it's best to keep it at multiples of the game's original native res. The config file posted is my own modified one; it was included in the Bootleg installer, I presume.

4
I've tested the shader there on a GTX 260, and I now see what you mean: the shader is most definitely not working as intended on that Nvidia card. It looks terrible, and performs terribly.

It performs and looks much, much better on my own card (Radeon 6950). It would seem there is a problem somewhere..

The effects seem to work (if poorly) when enabled separately, but when used together they don't work and create a huge slowdown. This is not happening on my own card.

Trust me, the shader looks far nicer when it's actually working. As of now, I'm not sure where the cross-GPU problem lies; it needs more testing. The most frustrating part is: it works perfectly on my 6950 >:

Edit: PS: thanks for the vid upload, Leonhart7413. And yeah, the shader effects are not working there; I could tell instantly.

5
I've figured out where the performance bottleneck was occurring, so at least that seems to be fixed. During my test I went from 145 fps in a battle to 325 after the fix, which is a tidy net gain. I'll wait to get a few more things sorted/updated before posting an update here, though.

Edit: I've actually decided to do away with the current AA and bloom methods, for these reasons: the merged AA (5xBR and FXAA) is causing issues, and the bloom doesn't go as well with the SSAO as I had hoped.

I plan to write an edge-detect NFAA (normal filter anti-aliasing) technique at some stage today, if I can get around to it; the rough idea is sketched below.
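
Something like this, roughly (an illustrative sketch of the technique, not the actual implementation; all names are made up):

uniform sampler2D sceneTex;
uniform vec2 texelSize; // 1.0 / screen resolution

float lum(vec2 uv) { return dot(texture2D(sceneTex, uv).rgb, vec3(0.299, 0.587, 0.114)); }

void main()
{
    vec2 uv = gl_TexCoord[0].xy;
    // luminance gradient approximates the edge normal
    float gx = lum(uv + vec2(texelSize.x, 0.0)) - lum(uv - vec2(texelSize.x, 0.0));
    float gy = lum(uv + vec2(0.0, texelSize.y)) - lum(uv - vec2(0.0, texelSize.y));
    // sample along the edge (perpendicular to the gradient) and average
    vec2 dir = vec2(gy, -gx) * texelSize * 2.0;
    vec3 colour = (texture2D(sceneTex, uv).rgb
                 + texture2D(sceneTex, uv + dir).rgb
                 + texture2D(sceneTex, uv - dir).rgb) / 3.0;
    gl_FragColor = vec4(colour, 1.0);
}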

I've also devised a better depth-check function for more accurate occlusion (as accurate as you can get in games with 2D backgrounds, at least); there's a rough sketch below. As far as I'm aware, I'm the only one who has tried implementing SSAO in a GLSL shader designed for PS1-era games lol.
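
For the curious, the depth comparison is along these lines (an illustrative sketch only, not the final code; names are made up):

uniform sampler2D depthTex;

// returns this sample's occlusion contribution; called from the SSAO sample loop
float occlusionAt(vec2 sampleUV, float centerDepth)
{
    float sampleDepth = texture2D(depthTex, sampleUV).r;
    float diff = centerDepth - sampleDepth; // positive when the sample is in front
    // fade the contribution out as the depth gap grows, to avoid haloing
    float rangeCheck = smoothstep(0.0, 1.0, 0.02 / max(abs(diff), 1e-5));
    return step(0.001, diff) * rangeCheck;
}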

6
Thanks for the good testing feedback. Would you believe I tested it on Aali's driver, Pete's OGL2, AMD Shader Analyzer and LunarG, and got none of these errors? I'll have access to a GTX 260 later for testing on Nvidia, as I'm running on a 6950. This card seems to be blind to a few idiotic errors that should crop up when I compile on it, but they don't :<

Anyway, I'll get on it.

Leonhart7413, any chance you could post a screenshot of v120 in-game, just to see if it looks as intended, please?

7
I've completely rewritten the GLSL shader I uploaded a while back (it was having trouble on some GPUs with older shader models). So far, it includes:
  • Ability to turn off/on, and customize each feature
  • HDRBloom
  • 5xBR-FXAA
  • Full Screen SSAO (Yes, SSAO).

I will upload it here during the week, in case anyone would like it, once I'm finished testing and satisfied with it.

If any other scripters/programmers would care to take a look and help optimize the code for extra speed, send me a PM.

PS: This shader works both with Aali's OGL driver and with any PS1 emulator whose GPU plugin supports GLSL shaders (Pete's OGL2, gpuBladeSoft, etc.), for those who prefer to play FF VIII on an emulator (the PC version's sprites are horrid).

I'm releasing the above-mentioned shader for testing.

Note 1: Be sure to change the .post shader source name in your cfg file.
Note 2: If you open the post shader you will see three version numbers at the top of the file (120, 330 & 420), two of which are commented out. If you have a GPU with shader model 2.0/3.0 (a DirectX 9 card), leave it as it is. If you have a GPU with SM 4.0 (a DirectX 10 card) or SM 5.0 (a DirectX 11 card), you can comment out #version 120 and remove the comment before your desired version number, to take advantage of more recent OGL functions and a bit more performance; see the example below.
You can also customize each feature to your liking, if you don't like the default setup.
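
For example, the top of the file looks something like this by default (SM 2.0/3.0, DirectX 9):

#version 120
//#version 330
//#version 420

And switched over for an SM 4.0 (DirectX 10) card:

//#version 120
#version 330
//#version 420
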
Note 3: If you would like to use these shaders on a PS1 emulator with a GPU plugin that supports GLSL shaders (e.g. Pete's OpenGL2), rename ComplexMultiShader.post to gpuPeteOGL2.slf and main.vert to gpuPeteOGL2.slv, and pop them into your emu's shader folder.

File temporarily removed (not working on Nvidia GPUs atm)

Would appreciate feedback.

9
FF7 Tools / Re: Kimera: FF7 p model simple editor
« on: 2012-04-19 12:42:30 »
Are you using this model?

FF7 HQ Aerith mod.rar | megaten | Aerith battle model | Qhimm FILE


There is an issue with the number of texture slots for this model.
The model has the texture slots set to 12. The maximum number of slots is 10.
This creates an error in Kimera when loading the model.
You can fix this error by loading the model in PCreator and setting the texture slots to 10.


Thanks for the tip. I am indeed using a custom version of that model, modded by me, but unfortunately the problem occurs for any model/part. That's good to know, though.

If Kimera works fine on one machine and fails on another, my guess is that it's most likely due to different versions of the VBRUN libraries. I also found quite a few overflow errors when loading 3DS files recently, because the values weren't automatically cast from Integer to Long. VB sure is a strange language...

Thanks for the info. I haven't gone near VB in about, err, 6 years; it must be a library/runtime issue then. I'll go check that out, as I'm possibly missing some older sets. Thanks again for the advice.

Edit: I downloaded every library I could find for VB, and the problem persists. I'd say it's that casting error, though I don't know why it's happening. I'd take the source and compile it myself to see if I had any luck, but I don't even have VB anymore, only VS, and I'm not sure if there are any plugins to compile VB; I doubt it though.

10
Hey,

I'm getting an overflow error on a Phenom II 955 when I try to do any manipulation of, or rotate, a model/part. No such error on an i7 860. Are there any problems with this on AMD instructions? Something's trying to store a value outside its defined range, I'm assuming. Wondering if this is a known issue, as I can't use the app from my own home machine.


If anyone knows a work around for this, it would be appreciated, thanks.

System:
Phenom II 955
Radeon 6950
Asus M4A89GTD PRO
Win7 Pro

11
Step 1: remove all the lighting stuff (I don't know why you based it on the lighting shaders in the first place); this will save at least 20 slots.
Step 2: pack all varying variables into vec4 types. Each varying will use up 4 slots regardless of its type; that is, 4 * vec3 will use up 16 slots, but 3 * vec4 will only use up 12 slots for the same total number of components (12).

Et voila, now you should be well under 64 slots and it should work on most cards.
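
A minimal sketch of the packing idea (variable names hypothetical):

// before: 4 varyings at 4 slots each = 16 slots for 12 components
// varying vec3 lightDir;  varying vec3 viewDir;  varying vec3 normal;  varying vec3 halfVec;

// after: 3 * vec4 = 12 slots for the same 12 components
varying vec4 pack0; // xyz = lightDir, w = normal.x
varying vec4 pack1; // xyz = viewDir,  w = normal.y
varying vec4 pack2; // xyz = halfVec,  w = normal.z

void main()
{
    vec3 lightDir = pack0.xyz;
    vec3 viewDir  = pack1.xyz;
    vec3 halfVec  = pack2.xyz;
    vec3 normal   = vec3(pack0.w, pack1.w, pack2.w);
    gl_FragColor  = vec4(normal * 0.5 + 0.5, 1.0); // placeholder so the sketch compiles
}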

I'll do that, thanks. As for the overflow: initially I had only planned to add extra 'fake' lighting contrast to scenes, to counter the scaling of the sprite BGs in FF7. When you're running the game at very high res the models clearly stand out from the 2D backgrounds, and this sorts it.

Of course, when you start something you mean to do very little on, you end up going a bit overboard and adding too much lol.

Edit: Great work on the driver btw, very impressive coding. I assume you hear this question too often, but do you have any plans to release the source? I'd love to tinker around with it.

12
I tested those again; the code is fine. It's a problem with older GeForce cards, I'm afraid: it happens with the older bloom2 shader and the old version of SmartBloom as well. I'm getting the same error on a GTX 260, but not with my own Radeon 6950.

I'll see if it's anything I can sort out with a work-around.

I have an nVidia card :p and what I meant by SSAA was supersampling. Although at one point, I remember, it was broken.

I assumed you had an AMD card when you said SSAA, as Nvidia don't have SSAA; they have supersampling on transparent textures, which is not SSAA :). SSAA renders at a multiple of the screen resolution and scales it back down to remove the jaggies; e.g. 4x SSAA at 1080p renders four times the pixels (3840x2160) and rescales back down to 1080p, so you need no filtering, and it produces zero blur.

13
@Hellbringer616
I have the FXAA matrix dense enough to completely nullify any screen blurring: I took the native x/y res of FF7/8 and multiplied it by 26 (13x2) to support an internal resolution of 16640x12480 with no blurring, as the output res to max internal res is less than 1:1. Also, by the way, your CCC AA probably isn't doing anything, as the back end of the driver is OpenGL, so you're not actually getting the edge-detect SSAA you're setting up in Catalyst.


@KyubiNemesis
Your card has only 32 or 64 varying float slots, and the shader is running out of them during compilation. I'll take a look at it and upload a separate version for it later if I can, else tomorrow.



14
I have more-or-less completely rewritten the original smart shader, and I have tested it on 4 systems, both AMD and Nvidia. This shader may be too demanding for older GPUs. Also, it doesn't work correctly on older GTX cards (pre GTX 260): the fragment files import functions which are not supported by their shader models; newer GTX cards are fine. In other words, if your GPU is ancient, stick to the original bloom shader :)
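
The sort of thing that trips up older shader models looks roughly like this (an illustrative sketch; names made up, not the shader's actual code):

uniform sampler2D sceneTex;

vec4 fetch(vec2 uv)
{
#if __VERSION__ >= 130
    return texture(sceneTex, uv);    // newer GLSL entry point
#else
    return texture2D(sceneTex, uv);  // GLSL 1.20 fallback for older cards
#endif
}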

FXAA, HDR, environment light mapping, support for ultra-high resolutions without blurring, dynamic scene-enhancing contrast.

GLSLSmartFilterHDR

If I get time I may upload a video; I will see.

15
Sorry about that, fellas; it was working for me. Use the files below. I've updated the above file as well; it works perfectly for me.

HDRSBloomv3.zip


16
Just a question
In the opengl.cfg?:

post_source = shaders/SmartBloom.post
enable_postprocessing = yes
use_shaders = yes

??? Thanks!

I use:

enable_postprocessing = yes
post_source = shaders/SmartBloom.post
yuv_source = shaders/yuv.frag
frag_source = shaders/main.frag
vert_source = shaders/main.vert

Just to be safe.

I've updated the shader again also; it's in the above post. That should be the last update needed, enjoy.

Edit:
@savage-xp
I'm not 100% on this, but I'd imagine the AA would be incorporated in the post-processing effects only, such as the FXAA and the linear filter. I've also tested this, and the GPU control panel enhancements and/or regedit options have no effect, as the game is running off the OpenGL API. I've found no multisampling options for the config file, and I've been pretty thorough lol.

17
Those values have been found before; they either do nothing or are still being worked on, which is why they're not officially documented by Aali.

Well, I know for a fact that the internal res function, window pos, skip_frames and max_lights work, because I've tested them myself. I've yet to see any problems with using these commands either, especially the internal resolution control. You can actually gain better performance by increasing the internal res, if you have a decent modern GPU: the extra load lets the GPU switch to its full 3D clocks, which it wasn't doing at 1080p, as the game is so old and requires hardly any horsepower.

I am currently running FF7 @ 8320x6240. Overkill, yes, but why not, if you can.

PhenomII 955@ 4Ghz
Radeon 6950 @ 900/1400

I also updated the fragment/vertex shader files last night; it looks very nice in Final Fantasy VII. They include the original smart-filter bloom code, plus extra lighting functionality that adds more depth to scenes by enhancing the lighting/'shadows' (darker areas) of each scene. It works especially well at higher resolutions with the field texture packs, so the background scenes don't look like mere 2D images with 3D models standing on top of them. If I make any more updates to the shader's functionality, I will post them via this post.

Update: HDRSBloomv3.zip

The high dynamic range is now complete in all the imported files, and looks great imo; this should be the last update needed.

18
I'm not sure if these values are generally known or not, but I figured I would share them just in case.

Using a hex editor, I've found a few functions that were omitted from the opengl.cfg file. These include custom internal resolution control (it is best to keep it at multiples of the native res; the highest internal res supported using multiples of the native res = 10200x6240) and a widescreen hack, though it seems this just produces the same black-bar effect as preserve_aspect.

A few of the others are (function = value type): window position x/y = int, refresh_rate = int, new_hp_limit = bool, new_mp_limit = bool, show_light_rays = bool, show_normals = bool, show_tbn_space = bool, max_lights = bool, skip_frames = bool.
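
For illustration, a config fragment using a few of these might look like this (the key names here are illustrative guesses; check the example file linked below for the exact spelling):

window_pos_x = 0
window_pos_y = 0
refresh_rate = 60
internal_size_x = 5120
internal_size_y = 3840
skip_frames = no
max_lights = yes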

I have included a link to an example config file below, using the values I use myself. I have also modified the SmartBloom shader with 'fake' HDR, refraction, chromatic dispersion & a Fresnel effect for more depth to scenes, plus support for much higher resolutions.
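
The chromatic dispersion idea, very roughly, is to sample each colour channel at a slightly different radius from the screen centre (an illustrative sketch only, not the shader's actual code; names made up):

uniform sampler2D sceneTex;

void main()
{
    vec2 uv = gl_TexCoord[0].xy;
    vec2 d = uv - vec2(0.5); // offset from screen centre
    // red and blue fringe outward/inward slightly, green stays put
    float r = texture2D(sceneTex, uv - d * 0.004).r;
    float g = texture2D(sceneTex, uv).g;
    float b = texture2D(sceneTex, uv + d * 0.004).b;
    gl_FragColor = vec4(r, g, b, 1.0);
}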

ff7_opengl.zip

HDRSBloomGLSL.zip

Edit: Wrong link for Shader file. Fixed, enjoy.
Regards.

19
Here is the Barret remod:

HQBarret.zip


20
I have edited this model a bit to look (imo) a bit more like Barret's original face. Just thought I'd post it in case the author of the model, or anyone else, would want it; with upload consent, of course.

