So the font editor is nearly complete. Here are a few screenshots:
- The new font dialog.
- The main font editor window.
- A new color picker.
- Text editor to test your font.
Here’s a sample of text rendering in Gorgon 2.0 (Dorian). It’s rendering 16,019 characters, animated with shadowing (which doubles the character count), plus the FPS counter. When it renders the text, it renders with kerning information (provided the font has any) so the output should be properly spaced. And while it’s doing this, it’s scaling the text to pump up the fill rate.
(The video has since been deleted)
All that at ~75 FPS, that’s not too bad hey?
In this particular “demo” you can see that I’m able to compress and expand the lines of text. This is possible because of the new “LineSpacing” property on the text object. It allows the user to set line spacing via a multiplier. For example, a LineSpacing of 2.0 will give you double spacing and 0.5 will move the lines only halfway.
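The idea behind a line-spacing multiplier is simple enough to sketch. This is just an illustration of the concept, not Gorgon’s actual code; the function name and layout are made up:

```python
def line_positions(line_count, line_height, line_spacing=1.0):
    """Compute the vertical offset of each line of text.

    line_spacing is a multiplier on the font's line height, like the
    LineSpacing property described above: 2.0 gives double spacing,
    0.5 packs the lines at half the normal advance.
    """
    advance = line_height * line_spacing
    return [i * advance for i in range(line_count)]

# 4 lines of a 16-pixel-tall font at double spacing:
print(line_positions(4, 16, 2.0))  # [0.0, 32.0, 64.0, 96.0]
```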
Anyway, I’m still plowing through all of this. And I’m pretty happy with the results.
Sweet merciful fuck. That was painful.
I just spent the last 4 evenings/nights writing Gorgon’s font system and it was not fun. And, of course, I’m still nowhere near done (I still have to create the TextSprite object, plus the other font effects like gradients and GDI+ image brushes). But I got it working. Not only that, unlike the previous incarnation, it actually implements kerning properly (well, approximated kerning, I’m not dicking around with actual kerning. Fuck that.) One of the things I absolutely hated about the 1.x font/text rendering was that while it worked, every now and again (especially with smaller fonts) it’d screw up and a letter would appear slightly (or not so slightly) shifted. Quite annoying. Here’s a screen shot showing how awful it was compared to GDI+ and how awesome v2 is going to be (click to expand it):
Note that all text is the same font: Arial, 9.0 point, bolded and antialiased. Also note that v2.0 is nearly (though not quite pixel-perfect) identical to the GDI+ DrawString version. I think that’s a slight improvement.
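The kerning-aware layout above boils down to adjusting each glyph’s position based on the pair it forms with the previous glyph. Here’s a rough sketch of that idea; the advance and kerning tables are made up for illustration — real data would come from the font itself:

```python
def layout_text(text, advances, kerning_pairs):
    """Return the x position of each glyph, applying pair kerning.

    advances maps each character to its horizontal advance;
    kerning_pairs maps (left, right) character pairs to an
    adjustment added between them (usually negative).
    """
    x = 0.0
    positions = []
    prev = None
    for ch in text:
        if prev is not None:
            # Pull the pair closer (or push apart) before placing ch.
            x += kerning_pairs.get((prev, ch), 0.0)
        positions.append(x)
        x += advances[ch]
        prev = ch
    return positions

# Hypothetical metrics: "AV" pairs overlap by 2 units.
advances = {"A": 10.0, "V": 10.0, "T": 9.0}
kerning = {("A", "V"): -2.0, ("V", "A"): -2.0}
print(layout_text("AVA", advances, kerning))  # [0.0, 8.0, 16.0]
```

Without the kerning table, the same call would space the glyphs at their plain advances, which is exactly the “slightly shifted” look the old renderer produced on tight pairs.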
Here’s another bundled effect, the Gaussian Blur shader:
You can watch it in higher quality (because it’s hella blurry – no pun) by changing it to HD or better yet, go to the video directly on youtube.
So I got custom shaders up and running in Gorgon 2.0. The new version has always had shader support because it’s built on Direct3D 11, which requires them. However, up until now the shader code has been hardcoded to use 3 default shaders. But after today, there’s the ability to use customized shaders:
In this little sample, there’s a new system at work. It’s basically a mini effects framework that can do multi-pass rendering and in this video there’s a wave shader and an embossing shader at work on the sprite. If you want to learn more, click the stupid link below
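The shape of a mini multi-pass effects framework like the one described is easy to sketch: each pass transforms the output of the previous one, the way a renderer feeds one pass’s render target into the next pass as a texture. This is a toy illustration (pixels as a plain list, passes as functions), not the actual framework:

```python
class Effect:
    """One pass of a mini effects framework."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def render(self, pixels):
        return self.fn(pixels)

def run_passes(pixels, effects):
    """Chain passes: each one consumes the previous pass's output."""
    for effect in effects:
        pixels = effect.render(pixels)
    return pixels

# Two toy passes standing in for the wave and emboss shaders:
wave = Effect("wave", lambda px: [p + 1 for p in px])
emboss = Effect("emboss", lambda px: [p * 2 for p in px])
print(run_passes([1, 2, 3], [wave, emboss]))  # [4, 6, 8]
```

Note that the order matters: swapping the passes gives a different result, just as waving then embossing a sprite looks different from embossing then waving it.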
I am sick as a pig. I hate having a fucking cold.
So, I figured out the issue that prompted all of this and it took damn near forever. Apparently feature level 9.x devices can’t copy GPU data from resources (e.g. textures) that have a shader binding flag into CPU-accessible resources. And apparently the D3DX function that saves the texture does exactly that: it copies the GPU texture to a CPU staging texture and writes that out (makes sense, reading directly from the video device is a no-no). Unfortunately my device object just says “fuck it” and promptly dies when this happens, which seems like a driver problem. Anyway, it’s fixed now. Not that it matters, but it was painful and could have meant the end of Direct3D 9 video card support in Gorgon (which no one probably really cares about anyway).
Unfortunately the fix comes at a price. Part of that price is increased memory usage. It’s painful enough to have to create temporary textures when converting to a format that can’t accept anything but standard 32-bit RGBA formatting, but with feature level 9.x there needs to be yet another temporary texture that doesn’t have a shader binding flag. It’s kind of gross. The other part is that the only way to get a texture without a shader binding is to create it as a render target (unordered access views would have been nice, but they’re for Direct3D 11 devices only), so that limits the number of formats that can be used when saving.
Anyway, thought I’d throw that out there.
I need some help, I’m having issues when saving a texture and I need to know if it’s a driver issue or something in Direct3D 11. I have a post in the forum about it here. If you can help and you have an Nvidia video card, please go read the post and run the test application (Windows 7/Vista SP2 required). Thanks.
So, I’m moving to a new place tomorrow and work on Gorgon v2 is going to halt for a bit until I get my life back in order. In the meantime, here’s a screenshot of the primitives (rectangles, lines, etc…) that have been making me insane (click it to see a larger version):
You can see the line (barely, I know, you can see it when it’s running for sure) and the rectangle, but I’ve gotten ellipses to work as well. Now, what’s the big deal you ask? (You are asking that, I demand it). And I’ll tell you. Unlike the previous incarnation, where the primitives were generated one pixel at a time (very inefficient), this time it’s using polygons to generate the primitives. So a line uses the line drawing on the video card, the rectangle and unfilled ellipse use line drawing as well, and the filled ellipse uses triangles. So all in all, they’re MUCH faster than the previous version. For more details click the thingy at the bottom there…
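Generating an ellipse as polygons instead of per-pixel plotting amounts to sampling points around the curve and connecting them: as line segments for the outline, or as a triangle fan for the filled version. A rough sketch of that idea (segment count and function names are my own, not Gorgon’s API):

```python
import math

def ellipse_points(cx, cy, rx, ry, segments):
    """Approximate an ellipse outline as a ring of points, which the
    video card can draw as line segments instead of single pixels."""
    pts = []
    for i in range(segments):
        angle = 2.0 * math.pi * i / segments
        pts.append((cx + rx * math.cos(angle), cy + ry * math.sin(angle)))
    return pts

def ellipse_triangles(cx, cy, rx, ry, segments):
    """A filled ellipse as a triangle fan around the center point."""
    ring = ellipse_points(cx, cy, rx, ry, segments)
    return [((cx, cy), ring[i], ring[(i + 1) % segments])
            for i in range(segments)]

# 32 segments gives a smooth-enough ellipse from just 32 triangles,
# versus touching every covered pixel one at a time:
print(len(ellipse_triangles(0, 0, 10, 5, 32)))  # 32
```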
So, here’s some more proof that I’ve been working on the next version of Gorgon:
https://www.youtube.com/watch?v=GFO6ZMdV2-A
As per the description on the youtubes:
An example showing the new version of Gorgon.
Currently this is just a simple sprite test using 1024 multi-textured sprites via shaders on Direct3D 11 hardware. It also shows a new feature that’s being planned (but not promising anything) to use the 3D stuff to allow perspective corrected sprites.
This video also shows depth for the sprites by walking a camera into the sprite cloud.
Currently getting about 1200 FPS with this (the selective multi-texturing really slows shit down).
So, one of the shortcomings of the original Gorgon was that there was no support for multi-monitor. And I see now why I didn’t bother… what a pain in the ass.
Anyway, I finally figured it out. See, there are two ways to do multi-monitor support in Direct3D 9:
Continue reading
So, to prove that I actually do work on stuff, I’ve uploaded a new video to the youtubes. This one shows off the ability to use MSAA in the new version of Gorgon.
https://www.youtube.com/watch?v=DhqrL3iVjDU
To get this effect, in v1.x of Gorgon, you’d draw a series of fading sprites (Alpha of 0 from the start position to an Alpha of 255 for the current position). However, in this example I’ve used MSAA to simulate motion blur on a sprite. Nifty eh? On top of the nifty effect we also get full screen anti-aliasing, which is something the previous incarnation of Gorgon didn’t have.
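The v1.x fading-sprite technique described above is just a ramp of alpha values across the trail of previous positions. A quick sketch of that ramp (the function is illustrative, not part of Gorgon):

```python
def trail_alphas(count):
    """Alpha values for a motion-blur trail of sprites: 0 at the
    oldest position ramping up to 255 at the current position."""
    if count == 1:
        return [255]
    return [round(255 * i / (count - 1)) for i in range(count)]

# A 5-sprite trail fades in from fully transparent to fully opaque:
print(trail_alphas(5))  # [0, 64, 128, 191, 255]
```

Each frame you’d draw the sprite once per trail position with its corresponding alpha, which is exactly why the MSAA approach is cheaper: one draw instead of many.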
Before you ask: no, motion blur will not be included as a function of the library, that’ll be up to the developer to implement.
Yep, finally. I’ve rolled up all the updates/fixes that were in the subversion repository and put up a new version of Gorgon. Version 1.1.4119.34319 is the latest version and you can get it from here. You can view the change log in this forum post.
Enjoy.
So, I bet you’ve been wondering what I’ve been up to lately… You haven’t? You selfish bastard.
Anyhow, I’ve gotten around to playing with this library I wrote for a bit. I do so little programming on my own time these days, and honestly I never much cared for Gorgon (I felt it could have been better), so I haven’t bothered to try and write anything with it. That my users actually say it’s useful and well written comes as a complete shock to me. Anyway, this last week I mustered up some spare time and created this abomination:
https://www.youtube.com/watch?v=PqPGa6P52LM
It’s not much, but it’s just a little thing I threw together to see if I could get a “bloom” type effect with a star. I did. And there it is. Note how the surface of the star moves around and all that. Neat hey? No? Shut up.
I’ve limited it to 60 FPS on purpose, but it is fairly swift. I think at one point I was getting > 1000 FPS. However, my vidja card is quite beefy, so take that with a grain of salt. So… yeah…. that’s all I have to say.
Enjoy.