iBeams

iPhone/iPad Graphics Apps

This is a downer

Apple seems to have changed some of their indexing/search algorithms; the result is that the free version of iBeams has plunged in "popularity" over the last week:

http://www.mobclix.com/appstore/1/app/304286607

A new version of the lite app should show up in a day or two; the lite version is pretty close to what I originally shipped as iBeams. A new version of iBeams with the infinite recursion code should show up in a similar time frame. I'm really hoping that changes the sales trajectory.


April 1, 2009

The Next Version of iBeams and Render-to-Texture

One of the glitches I hit while developing iBeams was an odd visual humming when motion blur was turned all the way up. I've attempted to fix this by rendering into an off-screen frame buffer, and discovered some nice side benefits:

Video:  http://www.flickr.com/photos/56692683@N00/3384301461/

I've seen a couple of discussions on the web suggesting that the iPhone doesn't support framebuffer objects. They definitely work, although I didn't have much luck getting them to render into a buffer bigger than the screen resolution (320 x 480).

By ping-ponging between two FBOs I was able to create a feedback loop, much like you'd get by pointing a video camera at a monitor showing the camera's own view, or a hall of mirrors. With a couple of parameter tweaks, iBeams now emulates a dynamic Spirograph stretching off into infinity. I'm hoping to get the changes submitted to the store in the next couple of days.
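For anyone curious what the ping-pong setup looks like, here's a minimal sketch using the GL_OES_framebuffer_object extension that OpenGL ES 1.1 exposes on the iPhone. The texture size, the screenFramebuffer handle, and the two draw helpers are placeholders I made up for illustration, not the actual iBeams code:

```cpp
// Sketch only: two texture-backed FBOs on OpenGL ES 1.1 (GL_OES_framebuffer_object).
// Each frame we sample the texture rendered last frame while drawing into the
// other FBO, which is what produces the video-feedback / hall-of-mirrors look.
#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>

extern GLuint screenFramebuffer;          // the app's on-screen FBO (assumed to exist)
extern void drawBeamsOverFeedbackQuad();  // placeholder: beams + a scaled quad textured
                                          // with last frame's result
extern void drawFullScreenQuad();         // placeholder: blit the current texture to screen

static GLuint fbo[2], tex[2];
static const GLsizei kSize = 256;         // power-of-two size picked for illustration
static int cur = 0;                       // index of the FBO being drawn into this frame

void createFeedbackBuffers()
{
    glGenFramebuffersOES(2, fbo);
    glGenTextures(2, tex);
    for (int i = 0; i < 2; ++i) {
        glBindTexture(GL_TEXTURE_2D, tex[i]);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, kSize, kSize, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo[i]);
        glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                                  GL_TEXTURE_2D, tex[i], 0);
    }
}

void renderFrame()
{
    int prev = 1 - cur;

    // Draw into the current FBO while sampling last frame's texture; slightly
    // scaling/rotating that quad is what stretches the pattern off into infinity.
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo[cur]);
    glViewport(0, 0, kSize, kSize);
    glBindTexture(GL_TEXTURE_2D, tex[prev]);
    drawBeamsOverFeedbackQuad();

    // Composite the freshly rendered texture to the on-screen framebuffer.
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, screenFramebuffer);
    glViewport(0, 0, 320, 480);
    glBindTexture(GL_TEXTURE_2D, tex[cur]);
    drawFullScreenQuad();

    cur = prev;                           // swap roles for the next frame
}
```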

March 26, 2009

News: Flickr Group, New version of iBeams

There's now an iBeams Flickr group: http://www.flickr.com/groups/ibeams/

We're putting the finishing touches on a new version of iBeams; UI screenshots below:

New UI page 1
New UI page 2

It's now possible to create new sources and new stable targets that aren't associated with touches, and to directly edit the color of the beams and the background. Hopefully it'll be out by next week.

The iBeams lite app has finally made it through QA. It's pretty weak; I really wish Apple offered timed demos instead of making you figure out how to cripple your software so that it's still useful but the full version is still worth buying.

February 16, 2009

Video of new iBeams 1.1

This video was done by Eric March of http://www.frapstr.com/ and shows off some of the latest features of iBeams. Version 1.1 is currently going through QA at Apple but should be out in a few days. It adds a number of new visual effects as well as a new user interface with controls for motion blur, the number of beams, the speed of the system, and a few other bits.

The review at Frapstr

January 26, 2009

iBeams available at the App Store

app store

So after a bit of hoop jumping, the app has been up at the store for the last week. I was actually traveling for work at the time. I'm hoping to have a video up in the next few days.

January 20, 2009

The first signs of life on the iPod touch

Initial Screen Shot

January 7, 2009

The original write-up of iBeams, back in 2001

A system for procedurally generating targeted effects

I'm sick of particle systems. Totally tired of them. OK, have I got your attention?

On a recent project we had a lot of time invested in a really nice particle effects system, and I was always amazed at what the artists could do with it. But this isn't about praising particles, so on to the gripe: many of the effects shot off missiles, but for quite a number of the magic effects we would just have the player casting the spell play an animation and create a particle system, and the target would create another particle system around themselves. Something always bugged me about this: I never really felt like the two had anything to do with each other.

Towards the end of the project one of the artists really wanted a lightning effect, and that brought the lack of visual connection to the forefront: lightning always comes from somewhere and goes somewhere, and there was no way to do it justice with a standard particle system.

So after a bit of hacking, and then quite a bit more coding, I wanted to present a targeted procedural f/x system for generating point-to-point effects. Some of these effects could be achieved with a more standard particle system, but most of them dynamically respond to the target location (visualized by the ball).

Construction Phase:

In the beam construction phase, the space between the source and target is first tessellated, and then various mathematical functions are applied to get different types of paths through space. This may be anything from a sinusoidal path to Perlin noise to a spiral wave. Additionally, there are parameters for how the rendering and movement should behave; there are close to 40 variables that control the construction and movement of the beams. To keep the calculations simple, all the path generation is done in local space: the math only takes the scalar distance between the source and the target into account when generating the paths.
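To make "tessellate, then displace" concrete, here's a tiny sketch of one path function. The Vec3 struct, the parameter names, and the sinusoidal displacement are illustrative stand-ins, not the original code with its 40-odd knobs:

```cpp
// Sketch of the construction phase: tessellate the span between source and
// target into segments, then displace each vertex with a path function.  All
// of this happens in a local space where the beam runs along +z, so only the
// scalar source-to-target distance matters.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

std::vector<Vec3> buildSinePath(float distance, int segments,
                                float amplitude, float frequency, float phase)
{
    std::vector<Vec3> path(segments + 1);
    for (int i = 0; i <= segments; ++i) {
        float t = float(i) / float(segments);    // 0 at the source, 1 at the target
        path[i].x = amplitude * std::sin(2.0f * 3.14159265f * frequency * t + phase);
        path[i].y = 0.0f;                        // a spiral wave would displace y as well
        path[i].z = t * distance;                // march along the local beam axis
    }
    return path;
}
```

Swapping the sine for Perlin noise gives the jagged lightning look; the rest of the pipeline doesn't care which function produced the vertices.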

Smart Emitters:

Another piece of the puzzle is aligning the beam with the target, which is done by creating a quaternion that represents the transform from a coordinate system with z up to a system aligned with the vector between the source and the target. The system can choose to align every frame or every time the beam dies and is regenerated.
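A sketch of that alignment step, building the shortest-arc quaternion from the local z axis to the source-to-target direction (the Vec3/Quat helpers are mine, and the antiparallel edge case is glossed over):

```cpp
// Sketch of the "smart emitter" alignment: a shortest-arc quaternion that
// rotates the beam's local +z axis onto the source-to-target direction.
#include <cmath>

struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };

static Vec3 normalized(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return Vec3{ v.x / len, v.y / len, v.z / len };
}

Quat alignBeamToTarget(Vec3 source, Vec3 target)
{
    Vec3 axis = { 0.0f, 0.0f, 1.0f };                       // beam's local z axis
    Vec3 dir  = normalized(Vec3{ target.x - source.x,
                                 target.y - source.y,
                                 target.z - source.z });

    // q = (1 + dot, cross), then normalize: rotates axis onto dir.
    Vec3 c = { axis.y * dir.z - axis.z * dir.y,
               axis.z * dir.x - axis.x * dir.z,
               axis.x * dir.y - axis.y * dir.x };
    float d = axis.x * dir.x + axis.y * dir.y + axis.z * dir.z;

    Quat q = { 1.0f + d, c.x, c.y, c.z };
    float len = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
    return Quat{ q.w / len, q.x / len, q.y / len, q.z / len };
}
```

Applying that quaternion every frame tracks a moving target; applying it only when the beam dies and regenerates gives the lazier of the two behaviors described above.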

There isn't really any reason that a standard particle system couldn't use the same paradigm of a smart emitter; for all I know, some already do.

Rendering and Animation:

Beams: Since I started out by trying to create lightning, I went with a triangle strip rendering technique that connects the vertices of the path together. Depending on the options, different amounts of the path are generated with different widths. This hasn't really led to a good way to use vertex buffers: dynamic data and static buffers don't work that wonderfully together, and I'm still thinking about how to get more of the work off the CPU and onto the video card.

Particles: Just to prove that the system could emulate a particle system, there is a particle rendering option as well. This simply draws a billboarded particle at each used vertex in the path.

Tubes: Some of the effects really called out for a more 3D feel than the 2D triangle strips could provide, so I experimented with a technique called parallel transport to effectively extrude tubes along the path. I still haven't gotten all the kinks out, but combined with a Cg translucency shader it is a pretty, albeit slow, option.
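As a rough idea of the beam renderer, here's a sketch that expands each path vertex into a pair of triangle-strip vertices offset by half the beam width. The constant side vector and the Vec3 struct are simplifications of my own; the real system varies widths along the path and has many more options:

```cpp
// Sketch of the beam renderer: each path vertex becomes two triangle-strip
// vertices offset by half the beam width along a side vector.
#include <vector>

struct Vec3 { float x, y, z; };

std::vector<Vec3> pathToTriangleStrip(const std::vector<Vec3>& path, float width)
{
    std::vector<Vec3> strip;
    strip.reserve(path.size() * 2);

    const Vec3 side = { 0.0f, 1.0f, 0.0f };   // perpendicular to the local beam axis (+z)
    const float half = width * 0.5f;

    for (const Vec3& p : path) {
        strip.push_back(Vec3{ p.x - side.x * half, p.y - side.y * half, p.z - side.z * half });
        strip.push_back(Vec3{ p.x + side.x * half, p.y + side.y * half, p.z + side.z * half });
    }
    return strip;                              // submit as GL_TRIANGLE_STRIP
}
```

The particle option amounts to drawing a billboarded quad at each of those path vertices instead of stitching them together into a strip.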

Tech: I used way too many pieces of other people's tech for this. I'm using OpenGL, GLUT, Cg (NVIDIA's C for Graphics), IBM's XML parser for parsing in the data, and GLUI for the dialog box.

Download the Demo

Notes: This will probably only work on GeForce-level video cards. If you get the assert "mDefaultShader.ModelViewProj != NULL, file BeamUI.cpp, line 332", you probably have an old version of cgc.exe installed. NVIDIA hard-codes the path to the exe in the registry, so you need to either upgrade Cg or blow away the registry key.

lightning


branch lightning


ball lightning


color spray


Curly beams


fire


lasers


Why must everything be named?


spiral Beams


arc paths

January 7, 2009

Hello world!

Welcome to WordPress.com. This is your first post. Edit or delete it and start blogging!

January 7, 2009
