AMODAL SUSPENSION Relational Architecture 8

TECHNOLOGY

“Amodal Suspension” consisted of a network of computers that processed user requests on a first-come, first-served basis. Here are the different technologies involved:

The user reached www.amodal.net on a Linux/Apache box behind a firewall. The server ran a queue programmed in Perl and shell, served pages and applets, relayed video and sent emails to participants. The top page of the project detected the browser type and directed cell phone users to a special version of the site.
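
The queue itself was written in Perl and shell and is not reproduced here. What follows is only a rough Java sketch of the first-come, first-served idea, with the hand-off to the searchlight PC stubbed out and all names invented for illustration:

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Illustrative sketch of a first-come, first-served message queue.
// The production queue was Perl and shell; everything here is invented.
public class MessageQueue {
    private final BlockingQueue<String> pending = new LinkedBlockingQueue<>();

    // Called by the web front end when a participant submits a message.
    public void enqueue(String message) throws InterruptedException {
        pending.put(message);
    }

    // Worker loop: takes messages in arrival order and hands each one
    // to the searchlight controller PC (stubbed out below).
    public void drain() throws InterruptedException {
        while (true) {
            String message = pending.take(); // blocks until a message arrives
            System.out.println("Relaying to searchlight PC: " + message);
        }
    }

    public static void main(String[] args) throws InterruptedException {
        MessageQueue q = new MessageQueue();
        new Thread(() -> {
            try { q.drain(); } catch (InterruptedException ignored) {}
        }).start();
        q.enqueue("hello yamaguchi");
    }
}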

To send or catch a message using the web interface, the user downloaded a Java applet that allowed real-time 3D visualization of VRML files without browser plug-ins on the vast majority of platforms. The virtual world was modelled by Arquimedia in Madrid and the applet was programmed by APR in Edmonton. When the applet loaded, it established a constant connection with the server so that the visualization corresponded exactly to the actual state of the lights in Yamaguchi. This constant stream consumed about 2 KB/s of bandwidth and also updated the usage statistics at the top of the applet.
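
The wire format of that connection is not documented here, so the following Java sketch is only a hedged illustration: the port number and the packet layout (twenty brightness values plus a usage counter) are assumptions.

import java.io.DataInputStream;
import java.net.Socket;

// Sketch of the applet's persistent state connection, under an assumed
// packet layout; the real protocol is not documented here.
public class LightStateClient {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("www.amodal.net", 8000); // port is an assumption
             DataInputStream in = new DataInputStream(socket.getInputStream())) {
            while (true) {
                // Hypothetical packet: 20 brightness values, one per searchlight,
                // followed by the usage-statistics counter shown atop the applet.
                float[] brightness = new float[20];
                for (int i = 0; i < 20; i++) brightness[i] = in.readFloat();
                int activeUsers = in.readInt();
                repaintScene(brightness, activeUsers);
            }
        }
    }

    static void repaintScene(float[] brightness, int activeUsers) {
        // In the real applet this updated the VRML world and the stats bar.
    }
}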

When a participant sent a message, it was received by the server and relayed to a PC that encoded it as a sequence of modulated flashes. The encoding was based on the statistical frequency distribution of characters in English and Japanese: the more frequent a character, the brighter the light beam; spaces between words were dark. The statistics we used to code our sequences can be found here for English, here for Kanji and here for Hiragana. With this encoding we could beam approximately 4 western characters or 2 Japanese characters per second using the servo-controlled douser of each searchlight.
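
As a minimal sketch of the encoding idea, in Java for consistency with the applets: the frequency values below are illustrative stand-ins for the published tables, and the direct mapping of frequency to douser brightness is an assumption. At roughly 4 western characters per second, each character occupies a slot of about 250 ms.

import java.util.HashMap;
import java.util.Map;

// Sketch of the frequency-to-brightness encoding. The published tables
// are not reproduced here; the few values below are illustrative only.
public class FlashEncoder {
    // Relative frequency of each character (0..1); higher = more common.
    static final Map<Character, Double> FREQ = new HashMap<>();
    static {
        FREQ.put('e', 0.127); FREQ.put('t', 0.091); FREQ.put('a', 0.082);
        FREQ.put('z', 0.001); // ...remaining characters from the English table
    }

    // One flash per character: brightness scales with frequency, and
    // spaces between words are encoded as darkness. At ~4 western
    // characters per second, each slot lasts about 250 ms.
    static double[] encode(String message) {
        double[] flashes = new double[message.length()];
        for (int i = 0; i < message.length(); i++) {
            char c = Character.toLowerCase(message.charAt(i));
            flashes[i] = (c == ' ') ? 0.0 : FREQ.getOrDefault(c, 0.05);
        }
        return flashes; // fed to the servo-controlled douser, one value per slot
    }
}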

The PC bounced the message randomly from searchlight to searchlight. Every time the message bounced, the intersection point between the beams rose, giving the connection a time dimension: older messages thus gradually drifted farther from the public on the ground. The searchlights were 20 Syncrolite SX7K units with 7 kW xenon bulbs and a highly collimated light beam; they had been calibrated in 3D with traditional surveying and were controlled over DMX.
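
The following sketch illustrates the bouncing geometry under stated assumptions: the surveyed positions are placeholders, the 50 m rise per hop is invented, and the DMX output is stubbed.

import java.util.Random;

// Sketch of the bouncing logic: each hop picks a new searchlight at random
// and raises the beams' intersection point, so older messages climb away
// from the ground. Heights and positions are illustrative assumptions.
public class BounceController {
    static final int NUM_LIGHTS = 20;
    static final Random rng = new Random();

    // Surveyed 3D positions of the 20 Syncrolite SX7K units (placeholders).
    static final double[][] lightPos = new double[NUM_LIGHTS][3];

    static void bounce(String messageId, int hop, int fromLight) {
        // Choose a different searchlight to receive the message.
        int toLight = rng.nextInt(NUM_LIGHTS);
        while (toLight == fromLight) toLight = rng.nextInt(NUM_LIGHTS);
        System.out.println("Message " + messageId + " hop " + hop + " -> light " + toLight);

        // Raise the intersection height with each hop (50 m per bounce is invented).
        double height = 100.0 + 50.0 * hop;

        // Aim both beams at the midpoint between the two units, at the target height.
        double x = (lightPos[fromLight][0] + lightPos[toLight][0]) / 2;
        double y = (lightPos[fromLight][1] + lightPos[toLight][1]) / 2;
        aimAt(fromLight, x, y, height);
        aimAt(toLight, x, y, height);
    }

    // Convert a target point into pan/tilt angles; the real units
    // received these over DMX (output stubbed here).
    static void aimAt(int light, double x, double y, double z) {
        double dx = x - lightPos[light][0];
        double dy = y - lightPos[light][1];
        double dz = z - lightPos[light][2];
        double pan = Math.atan2(dy, dx);
        double tilt = Math.atan2(dz, Math.hypot(dx, dy));
        // dmxSend(light, pan, tilt); // hypothetical DMX call
    }
}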

When a message was sent, the main server wrote a notification email to the intended recipient. If someone else caught the message, both the sender and the receiver were notified. As soon as a message was read it was taken away from the sky and added to the project log. The log could be viewed at the website in a 3D world that displayed 100 records at a time, although one could search and view all the records generated by the project. A web page was made for each caught message, featuring photos from eight webcams showing the lights; this page could be found by searching with the message's unique ID number.
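
Schematically, that notification flow could be sketched as follows; mail delivery, the light controller and the log are stubbed, and all names are invented:

// Sketch of the notification flow when a message is sent or caught.
// Email delivery is stubbed; the real server sent mail directly.
public class Notifier {
    static void onMessageSent(String messageId, String recipientEmail) {
        sendMail(recipientEmail,
            "A message is waiting for you in the sky over Yamaguchi. " +
            "Catch it at http://www.amodal.net (ID: " + messageId + ")");
    }

    static void onMessageCaught(String messageId, String senderEmail,
                                String catcherEmail) {
        // Both parties are notified; the message leaves the sky and is logged.
        sendMail(senderEmail, "Your message " + messageId + " was caught.");
        sendMail(catcherEmail, "You caught message " + messageId + ".");
        removeFromSky(messageId);
        appendToLog(messageId);
    }

    static void sendMail(String to, String body) { /* SMTP call elided */ }
    static void removeFromSky(String id) { /* tells the light controller */ }
    static void appendToLog(String id) { /* adds a record to the project log */ }
}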

A special 3D applet was also developed for “access pods” in media centers and access kiosks. These presented the messages as they were caught on a large display, together with the live light simulation. In Yamaguchi this was done with 12,000-lumen projectors projecting onto the façade of the YCAM building.

A bi-directional machine translation engine from IBM automatically provided Japanese and English versions of all messages. For the English applet we also designed a Kanji look-up table so that the system could provide a phonetic equivalent when the Japanese character sets were not available in the operating system.
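
A minimal sketch of such a fallback, assuming a simple character-to-romanization map; the two entries shown are illustrative and the real table is not reproduced here:

import java.awt.Font;
import java.util.HashMap;
import java.util.Map;

// Sketch of the Kanji fallback in the English applet: when the operating
// system cannot display a Japanese character, substitute a phonetic
// (romanized) equivalent from a look-up table. Entries are illustrative.
public class KanjiFallback {
    static final Map<Character, String> PHONETIC = new HashMap<>();
    static {
        PHONETIC.put('山', "yama");
        PHONETIC.put('口', "guchi");
        // ...the real table covered the full character set used by the project
    }

    static String render(String text, Font font) {
        StringBuilder out = new StringBuilder();
        for (int i = 0; i < text.length(); i++) {
            char c = text.charAt(i);
            if (!font.canDisplay(c) && PHONETIC.containsKey(c)) {
                out.append(PHONETIC.get(c)); // phonetic equivalent
            } else {
                out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        Font f = new Font("SansSerif", Font.PLAIN, 12);
        System.out.println(render("山口 hello", f));
    }
}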

To send and receive messages with a cell phone, a special cell phone web page was programmed with forms that triggered CGI scripts on the server.
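
For illustration only, here is the shape of such a form handler in Java, standing in for the original CGI scripts: the /send path and the reply text are invented, and Java's built-in HTTP server replaces the Apache/CGI setup.

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Sketch of a cell-phone form handler; the original used CGI scripts
// behind Apache, so everything here is a stand-in for illustration.
public class MobileGateway {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/send", exchange -> {
            // The query string carries the form fields submitted by the phone.
            String query = exchange.getRequestURI().getQuery();
            System.out.println("Received form data: " + query);
            // enqueue(query); // would hand the message to the main queue
            byte[] reply = "Message queued for the sky.".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, reply.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(reply); }
        });
        server.start();
    }
}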

A video applet was programmed to give visitors a real-time view of Yamaguchi. The Java applet showed output from eight Panasonic webcams placed around the YCAM building, networked over Ethernet or, in some cases, 802.11 wireless. A custom application written in C watermarked the images with the messages as they were caught. The video applet also showed an aerial view of the city, with a watermark subtitle displaying the latest message caught at any given time.
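
The original watermarking application was written in C; the Java sketch below shows the equivalent operation of drawing the latest caught message over a frame, with the file names assumed.

import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Sketch of the watermarking step (the original was a custom C program):
// the latest caught message is drawn as a subtitle over a webcam frame.
public class Watermarker {
    public static void main(String[] args) throws Exception {
        BufferedImage frame = ImageIO.read(new File("webcam.jpg")); // assumed input
        Graphics2D g = frame.createGraphics();
        g.setColor(Color.WHITE);
        g.drawString("latest caught message goes here", 10, frame.getHeight() - 10);
        g.dispose();
        ImageIO.write(frame, "jpg", new File("webcam_marked.jpg")); // assumed output
    }
}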