Monthly Archives: February 2008

Snapshot #1: San Jose Museum of Art

This dispatch inaugurates an occasional series of cross-sections through contemporary photography, which is far too broad to tackle as a whole, though I tried a couple of years ago for a presentation. Because I’m busy and inherently unfocused, I’m going to let other people do the work of slicing through the contemporary art photography world and share their results, neatly digested here.

Recently I visited the San Jose Museum of Art, which displayed photographs by the following artists:

  • Todd Hido
  • Susan Felter
  • Larry Sultan
  • Amanda Marchand
  • Edward Burtynsky
  • Richard Misrach
  • Michael Wolf
  • Binh Danh
  • Kimberly Austin
  • Vik Muniz
  • Lewis Baltz
  • Stéphane Couturier

Posted in OPP, Photography | 2 Comments

Come join us…

If you want to change careers or know someone who’s looking for a job, consider working for The MathWorks. We’ve grown almost continuously over the ten years that I’ve been working there, and it looks like that trend is continuing.

Wednesday, our HR people were handing out pink sheets — ironic? — containing a list of dozens of open positions. I won’t post the whole list, but here are the highlights:

So, if this sounds like the kind of thing you’re interested in doing and you want to work for a financially successful, privately owned, values-driven company, then take a look at what we have to offer.

Posted in General, Software Engineering | 1 Comment

The clip show episode #2

There’s a whole lot of good stuff out there on the Internet. Here’s just a bit that I discovered recently:

  • The Nerd Handbook — “A nerd needs a project because a nerd builds stuff. All the time. Those lulls in the conversation over dinner? That’s the nerd working on his project in his head.”
  • Out Loud — “That’s your goal, and you can have a wildly successful presentation without achieving it, but a one-slide presentation represents the ultimate commitment to your audience. It says, ‘This isn’t about slides. This is about me telling you a great story…’” (as seen at 43Folders)
  • ForTheScience.org: Mac OS X Leopard extended ls — The meaning of the “+” and “@” values in the long ls listing, plus working with ACLs and resource forks.
  • Stevey’s Blog Rants: Emergency elisp — “Most Lisp introductions try to give you the ‘Tao of Lisp’, complete with incense-burning, chanting, yoga and all that stuff. What I really wanted in the beginning was a simple cookbook for doing my ‘normal’ stuff in Lisp. So that’s what this is. It’s an introduction to how to write C, Java or JavaScript code in Emacs Lisp, more or less.”
  • California Photography Galleries and California Gallery Guide
Posted in Computing, General, Software Engineering, This is who we are, Uncategorized | Leave a comment

Electronic Imaging 2008: Your eye and novel single photon detectors

I’ve been telling everybody about this particular paper presentation at Electronic Imaging, because I think the implications are pretty cool. It’s yet another presentation where electrical engineers and image processing folks are treating machine vision more like human vision than they have tended to in the past. This phenomenon isn’t exactly new (retinex, for example, goes back over 35 years), but it’s refreshing to see that we can learn from neuroscience at the same time that we are shaped by computers. Anyway, that’s enough Arthur C. Clarke for now.

Before I give you some scanty details about Hooman Mohseni’s work on single-photon detectors, I’ll do what he did in his presentation and discuss rod cells, the eye’s own single-photon detectors.

A photon strikes a rod cell and interacts with rhodopsin in the cell membrane, which closes the flow of ions into the cell; the resulting charge builds up and is transmitted as an impulse to the rest of the visual system.

The rod cells in your eye are almost perfect single-photon detectors. When a photon of light in the visible range strikes a rod cell, it causes a chemical reaction with rhodopsin, a photo pigment. The rhodopsin molecule changes shape, closing an aperture in the rod cell wall. This partially interrupts the flow of charged sodium ions into the rod cell, and an electrical charge builds up. The charge causes an impulse that is transmitted to the rest of the visual system, possibly causing the sensation of seeing light. There are several layers of rhodopsin in the cell, allowing a greater impulse when more photons strike it. (You can read more details about rhodopsin and its paramour retinal if you want.)

Each rod cell can detect a single-photon event, and rods are photon-number resolving, meaning that the visual system can use the impulse to distinguish whether one, two, or twenty photons struck the rod cell. The false count rate is very low, approximately one per 100 seconds, and all of this happens at very low power. Pretty cool, huh?

Compare that to imaging sensors like CCDs, where noise overwhelms the actual signal at very low illumination levels. Even very accurate detectors that can reliably detect single photons (such as photomultiplier tubes) require high voltages and usually can’t resolve the difference between one and two photons. Signal-to-noise ratios have been pretty bad so far, too.
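
To make that comparison concrete, here’s a toy MATLAB simulation of my own (not anything from the talk) contrasting a photon-number-resolving detector that has a tiny false count rate with an integrating detector whose read noise swamps a one- or two-photon signal. All of the numbers are arbitrary round figures, and poissrnd requires the Statistics Toolbox.

% Toy comparison (arbitrary numbers): a rod-like photon-number-resolving
% detector versus a CCD-like detector with Gaussian read noise.
nFrames   = 1000;    % number of one-second exposure intervals
meanRate  = 2;       % mean photons per interval (Poisson arrivals)
darkRate  = 0.01;    % false counts per interval, roughly one per 100 seconds
readNoise = 5;       % read-noise standard deviation, in photon-equivalent units

photons = poissrnd(meanRate, nFrames, 1);             % true photon counts

rodCounts = photons + poissrnd(darkRate, nFrames, 1); % occasional false count
ccdSignal = photons + readNoise * randn(nFrames, 1);  % read noise buries the signal

fprintf('Rod-like RMS error: %.2f photons\n', sqrt(mean((rodCounts - photons).^2)));
fprintf('CCD-like RMS error: %.2f photons\n', sqrt(mean((ccdSignal - photons).^2)));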

Prof. Mohseni’s group at Northwestern has been using nanotechnology fabrication techniques to create a “focalized carrier augmented sensor (FOCUS)” which can transform a single photon event into an output of about 1,000 electrons, at very low voltages. Essentially they have produced something like a rod cell.

Their detector works really well, but nanofabrication is hard and slow. They can’t even image the small, tube-like sensors because they are too delicate; for example, atomic force microscopy — a very gentle technique — destroys the tubes. Nevertheless, this may be the future of things like night vision, positron-emission tomography, and so on.

Posted in Color and Vision | Leave a comment

Electronic Imaging 2008: Color Universal Design (and a MATLAB-based simulator)

Note: here’s another dispatch about what happened at the Electronic Imaging symposium, even though I’m back from San Jose. These will continue until I run out of useful things to write or something more interesting happens in my life.

Yasuyo Ichihara of Kogakuin University presented some research that she and her colleagues performed on what many of us call “color blindness” or “color deficiency.” She would prefer that we say these differently sighted folks have “color confusion,” which I’ll probably try since it’s slightly more evocative than “color deficiency.” And from a PC point of view, it may not be nice to stigmatize roughly 5% of the population as deficient.

Anyway, they presented research on how their lab applied color universal design to remake the map and timetables of the Tokyo subway system to be friendlier for people with protanopia and deuteranopia. It turns out that the color confusion can be modeled quite simply by placing two iso-chromaticity lines on the standard x-y chromaticity diagram. Along each of those lines, all colors appear to be the same hue, so it’s not too hard to pick maximally contrasting colors by avoiding the iso-lines. They picked four colors, which they claim are suitable for many print applications: black, orange-red, bluish-green, and a purer blue.
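
As a rough illustration of that geometric idea, here’s a little MATLAB sketch of my own (not something from the talk) that checks whether two chromaticities lie near the same confusion line through a copunctal point. The copunctal coordinates are the values commonly quoted in the colorimetry literature, and the two sample colors are arbitrary.

% Sketch: two xy chromaticities that lie on (nearly) the same line through a
% copunctal point will be confused by the corresponding dichromat.
protanPoint = [0.747, 0.253];   % commonly quoted protanope copunctal point
deutanPoint = [1.400, -0.400];  % commonly quoted deuteranope copunctal point

% Angle of the line from a copunctal point cp to a chromaticity xy.
lineAngle = @(xy, cp) atan2(xy(2) - cp(2), xy(1) - cp(1));

% Two candidate colors as xy chromaticities (arbitrary example values).
colorA = [0.45, 0.41];   % an orange-ish chromaticity
colorB = [0.30, 0.35];   % a bluish-green-ish chromaticity

copunctals = {protanPoint, deutanPoint};
for k = 1:numel(copunctals)
    cp = copunctals{k};
    dAngle = abs(lineAngle(colorA, cp) - lineAngle(colorB, cp));
    fprintf('Angular separation: %.1f degrees\n', dAngle * 180/pi);
end
% A small angular separation means the two colors hug the same confusion
% line and will be easy to mix up; pick pairs that keep the separation large.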

Color universal design doesn’t stop at picking the right colors. Adding non-color information — such as letters, underlining, framing, or other symbols — proved to be very useful aids to comprehension, too.

All of this reminded me that a while back I implemented a color deficiency simulator in MATLAB based on work by Hans Brettel. I’m making these functions available here in the hopes that they’re useful and that you actually use them when designing something colorful. Simply download the protanopia and deuteranopia simulator M-files and the associated data file. (You will need the Image Processing Toolbox, since these functions use rgb2ind.)

Let’s see what the peppers image I created some years ago looks like to people with color confusion:

rgb = imread('peppers.png');
figure; imshow(rgb); title('Normal color vision')
figure; imshow(deuteranopia(rgb)); title('Deuteranopia')
figure; imshow(protanopia(rgb)); title('Protanopia')


Posted in Color and Vision, Computing, MATLAB | 4 Comments