It has been an intense week for p-com, so my visualization piece is not a finished work yet. Here is what I have done since last week on visualizing the smell:
In my case I don’t really need the full gradient field pixel by pixel, since I’m just going to draw contour lines for different levels of odor density. So I extracted data only where the odor changes significantly from one level to the next, shrinking the gradient data file from 5.8MB to 28KB. This not only shortens the load time but also dramatically reduces the lag in the draw loop.
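The reduction step can be sketched roughly like this, in plain Java rather than Processing. The level count, the 1-D field, and all names here are illustrative assumptions, not the actual project code:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the data reduction: quantize the gradient field into a few
// odor-density levels and keep only the cells where the level changes
// from one neighbor to the next -- exactly the points contour lines need.
public class GradientReducer {
    // Map a raw density value in [0, 1] to one of `levels` discrete levels.
    static int quantize(double density, int levels) {
        int level = (int) (density * levels);
        return Math.min(level, levels - 1); // handle the density == 1.0 edge case
    }

    // Return the indices of cells whose level differs from the previous
    // cell's -- a 1-D stand-in for scanning the 2-D field.
    static List<Integer> levelChanges(double[] field, int levels) {
        List<Integer> kept = new ArrayList<>();
        for (int i = 1; i < field.length; i++) {
            if (quantize(field[i], levels) != quantize(field[i - 1], levels)) {
                kept.add(i);
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        double[] field = {0.05, 0.15, 0.22, 0.41, 0.43, 0.78, 0.95};
        // Only the level-boundary cells survive the reduction.
        System.out.println(levelChanges(field, 4));
    }
}
```

With 4 levels, a 7-cell row collapses to just the two cells where the level jumps, which is the same effect that took the real data from 5.8MB to 28KB.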
Another performance issue I found during the iterations: drawing the text labels in SCREEN text mode was REALLY slow, taking almost ten times as long as the default MODEL mode. That is definitely not acceptable in a visualization aimed at interaction (with a lot of redrawing).
I have always wanted to present the expected paths of the larvae and distinguish them from incorrect navigation. I also assume that in the given data all larva steps are sampled at the same time interval (since they were captured by a camera, maybe controlled by a time-lapse program?), so I must present the time factor in the static image as well. Here is what I got:
- Each path is always clearly marked with labels at both its start and end;
- Paths are grouped in similar colors if they belong to the same type of larva (unilateral or bilateral);
- The shape of the ending label changes when the larva stops moving or “quits”;
- The size of the ending label increases with time, so the user can tell which larva has traveled for a longer time, and which one reaches the destination earlier;
- The path grows thicker while the larva moves towards the odor center, and thinner when it moves away from the expected destination;
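The last two encodings boil down to simple mappings. A minimal sketch in plain Java, where the weight values and the label growth rate are made-up constants, not the ones in the actual sketch:

```java
// Sketch of two of the visual encodings above. All constants here
// (stroke weights, label radius, growth per step) are illustrative
// guesses, not values from the real visualization.
public class PathStyle {
    // Thicker stroke when the current step moves the larva closer to
    // the odor center, thinner when it moves away.
    static double strokeWeight(double prevDistToOdor, double currDistToOdor) {
        double base = 2.0;
        return currDistToOdor < prevDistToOdor ? base * 1.5 : base * 0.5;
    }

    // Ending-label radius grows linearly with the number of steps taken,
    // so longer-travelling larvae get bigger end markers.
    static double labelRadius(int stepsTaken) {
        double minRadius = 4.0;
        double growthPerStep = 0.05;
        return minRadius + growthPerStep * stepsTaken;
    }

    public static void main(String[] args) {
        System.out.println(strokeWeight(10.0, 8.0)); // moving toward the odor: 3.0
        System.out.println(labelRadius(100));        // 9.0
    }
}
```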
Refine and Interact
For both debugging and presentation needs, I decided to organize the paths drawn on the screen into 10 groups, each containing one unilateral larva and one bilateral one.
- Keyboard shortcuts 1 – 9 display different numbers of groups on screen; the 0 key shows them all;
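The shortcut handling is straightforward to sketch in plain Java. The total group count and the `-1` convention for unhandled keys are my assumptions for illustration:

```java
// Sketch of the group-selection shortcuts: digits 1-9 show that many
// groups, 0 shows all of them. TOTAL_GROUPS matches the 10 groups
// described above; the rest is illustrative.
public class GroupShortcuts {
    static final int TOTAL_GROUPS = 10;

    // Map a pressed key to the number of groups to display,
    // or -1 if the key is not a group shortcut.
    static int groupsFor(char key) {
        if (key == '0') return TOTAL_GROUPS;         // 0 shows all groups
        if (key >= '1' && key <= '9') return key - '0';
        return -1;                                    // not a group shortcut
    }

    public static void main(String[] args) {
        System.out.println(groupsFor('5')); // 5
        System.out.println(groupsFor('0')); // 10
    }
}
```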
The user should also be able to explore through time, which is why a step slider is provided in every view. The user can always drag the slider to a certain point in time to observe the larvae’s behavior (statistically).
- Keyboard shortcuts + and – navigate through time;
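The slider and the +/- keys both move the same current-step cursor, clamped to the sampled range. A minimal sketch, where the step-count field and key characters are assumptions:

```java
// Sketch of the time navigation: a current-step index moved by the
// +/- shortcuts (or set directly by the slider), clamped to [0, maxStep].
// The field names are illustrative, not from the actual applet.
public class TimeCursor {
    final int maxStep; // index of the last sampled step in the data
    int step = 0;

    TimeCursor(int maxStep) { this.maxStep = maxStep; }

    // '+' moves one step forward, '-' one step back, never past the ends.
    void handleKey(char key) {
        if (key == '+') step = Math.min(step + 1, maxStep);
        if (key == '-') step = Math.max(step - 1, 0);
    }
}
```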
One last thing is debug information for the developer (me!). In debug mode (keyboard shortcut d) I print out the actual and expected direction of each step. In fact it’s visually more beautiful than usual.
There are also weird things that I cannot explain from the data. What is the real exit criterion for the larvae? Apparently some of them exit by hitting the wall (orange A, D; blue C), others might exit by reaching the destination, and orange F might have triggered a timeout, but what is happening with orange B and blue C? Blue C is even outside the field!
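One way to make this question concrete is to classify each track’s final recorded state against candidate exit rules. The arena radius, destination radius, and step limit below are invented thresholds (the whole point is that the real criteria are unclear from the data):

```java
// Sketch of an exit-criteria classifier for a track's last sample,
// with the odor center at the origin. Every threshold here is a guess
// for illustration, not a value taken from the dataset.
public class ExitClassifier {
    static String classify(double x, double y, int steps,
                           double arenaRadius, double destRadius, int maxSteps) {
        double distFromCenter = Math.hypot(x, y);
        if (distFromCenter <= destRadius)  return "reached destination";
        if (distFromCenter >= arenaRadius) return "hit the wall (or left the field)";
        if (steps >= maxSteps)             return "timed out";
        return "unexplained"; // tracks like orange B would land here
    }
}
```

Running every track through a classifier like this would at least separate the explainable exits from the genuinely mysterious ones.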
The JPEG screenshots are terribly poor in quality; play with the applet directly via this link. It should be super fast!
As I said, this is not a complete work since I just didn’t have enough time to finish it. The zipcode exercise was really helpful methodologically, though. I didn’t find a good way to feature the integrator class in this smell exercise. But there are some other points that I wish to improve:
- legends and instructions to explain the differences between colors, shapes and strokes;
- data mining: statistics on the different types of groups. Possible axes: quit time (on success/failure), quit steps (on success/failure), etc.;
- visual effects: beyond dynamic effects for interaction, this data could be rendered as a nice piece of visual art as well. In that sense more data could be used without compromising interaction needs;
- what else?