Apple seems to have changed some of their indexing/search algorithms; the result is that the free version of iBeams has plunged in “popularity” in the last week:
A new version of the lite app should show up in a day or two; the lite version is pretty close to what I originally shipped as iBeams. A new version of iBeams with the infinite recursion code should show up in a similar time frame. I’m really hoping that changes the sales trajectory.
One of the glitches I hit while developing iBeams was an odd visual humming when turning motion blur all the way up. I’ve attempted to fix this by rendering into an off-screen frame buffer and discovered some nice side benefits:
I’ve seen a couple of discussions on the web suggesting that the iPhone doesn’t support frame buffer objects (FBOs). They definitely work, although I didn’t have much luck getting them to render into a buffer bigger than the screen resolution (320 x 480).
By ping-ponging between two FBOs I was able to create a feedback loop, much like you’d get by pointing a video camera at a monitor showing the camera’s own view, or a hall of mirrors. With a couple of parameter tweaks, iBeams now emulates a dynamic Spirograph stretching off into infinity. I’m hoping to get the changes submitted to the store in the next couple of days.
There’s now an iBeams Flickr group: http://www.flickr.com/groups/ibeams/
We’re putting the finishing touches on a new version of iBeams; UI screenshots below:
It’s now possible to create new sources, create stable targets that aren’t associated with touches, and directly edit the color of the beams and the background. Hopefully it’ll be out by next week.
The iBeams lite app has finally made it through QA. It’s pretty weak; I really wish Apple had timed demos instead of making developers figure out how to cripple their software so that it’s still useful but leaves the full version worth buying.
This video was done by Eric March of http://www.frapstr.com/ and shows off some of the latest features of iBeams. Version 1.1 is currently going through QA at Apple but should be out in a few days. It adds a number of new visual effects as well as a new user interface with controls for motion blur, the number of beams, the speed of the system, and a few other bits.
So after a bit of hoop jumping, the app has been up at the store for the last week. I was actually traveling for work at the time. I’m hoping to have a video up in the next few days.
A system for procedurally generating targeted effects

I’m sick of particle systems. Totally tired of them. OK, have I got your attention?
On a recent project we had a lot of time invested in a really nice particle effects system, and I was always amazed at what the artists could do with it. But this isn’t about praising particles, so on to the gripe: many of the effects shot off missiles, but for quite a number of the magic effects we would just have the player casting the spell play an animation and spawn a particle system, and the target would spawn another particle system around themselves. Something always bugged me about this: I never really felt like the two had anything to do with each other.
Towards the end of the project one of the artists really wanted a lightning effect, and that brought the lack of visual connection to the forefront: lightning always comes from somewhere and goes somewhere, and there was no way to do it justice with a standard particle system.
So after a bit of hacking, and then quite a bit more coding, I want to present a targeted procedural f/x system for generating point-to-point effects. Some of these effects could be achieved with a more standard particle system, but most of them dynamically respond to the target location (visualized by the ball).
In the beam construction phase, the space between the source and target is first tessellated, and then various mathematical functions are applied to get different types of paths through space. This may be anything from a sinusoidal path to Perlin noise to a spiral wave. There are also parameters for how the rendering and movement should behave; close to 40 variables control the construction and movement of the beams. To keep the calculations simple, all the path generation is done in local space: the math only takes the scalar distance between the source and the target into account when generating the paths.
Another piece of the puzzle is aligning the beam with the target. This is done by creating a quaternion that represents the transform from a coordinate system with z up to a system aligned with the vector between the source and the target. The system can choose to align every frame, or only when the beam dies and is regenerated.
There isn’t really any reason that a standard particle system couldn’t use the same paradigm of a smart emitter, for all I know some already do.
Rendering and Animation:
Beams: Since I started out trying to create lightning, I went with a triangle strip rendering technique where I connect the vertices of the path together. Depending on the options, different amounts of the path are generated at different widths. This hasn’t really led to a good way to use vertex buffers; dynamic data and static buffers don’t work that wonderfully together. I’m still thinking about how to get more of the code off the CPU and onto the video card.

Particles: Just to prove that the system could emulate a particle system, there is a particle rendering option as well. This just draws a billboarded particle at every used vertex in the path.

Tubes: Some of the effects really called out for a more 3D feel than the 2D triangle strips could provide. I experimented with a technique called parallel transport to extrude tubes along the path. I still haven’t gotten all the kinks out, but combined with a Cg translucency shader it’s a pretty, albeit slow, option.
Tech: I used way too many pieces of other people’s tech for this: OpenGL, GLUT, Cg (NVIDIA’s C for Graphics), IBM’s XML parser for reading in the data, and GLUI for the dialog box. Download the Demo
Notes: This will probably only work on GeForce-level video cards. If you get the assert “mDefaultShader.ModelViewProj != NULL, file BeamUI.cpp, line 332”, you probably have an old version of cgc.exe installed. NVIDIA hard-codes the path to the exe in the registry, so you need to either upgrade Cg or blow away the registry key.
Why must everything be named?