In the cacophony of internet storytelling, the emphasis on speed and volume can drown out richer, more immersive work. Part of the problem is the existing platforms and what we've come to expect from them: 140 characters is deliberately restrictive, and blogs are meant to be punchy.

Cowbird, the latest project from digital media artist and storyteller Jonathan Harris, provides a new community forum for interactive, multimedia storytelling. The goal is to create a space for "deeper, longer-lasting, more personal storytelling than you're likely to find anywhere else on the web."
The publishing tool, which was released publicly today, allows users to keep personal diaries or contribute their stories to collaboratively chart the progress of news events, or "sagas," which Harris defines as "things that touch millions of lives and define the human story," like Occupy Wall Street. The story of the Occupy movement—unfolding as a series of moments affecting people across the world, broadcast via a vast array of media—is knit together on Cowbird as a mosaic of time-plotted images, words, and audio submitted by participants.
There's the image of the first ever general assembly at Zuccotti Park on September 17. There's the moment on October 30 when one protester kisses his girlfriend goodbye before leaving Missoula, Montana to join the movement. Or the story that Harris himself shares of his arrest at Occupy Oakland on November 3. The tool's tagging features allow you to zero in on the different characters or places that compose a story. According to the site, the ultimate goal is "a public library of human experience—kind of like a Wikipedia for real life (but much more beautiful)."
Even if you're not politically inclined, Cowbird provides many features that make the platform stand out from other publishing tools. Storytellers can easily post full-screen photos, geotag moments to create a map of their lives, create subtitles for audio to play alongside images, and even turn those subtitles into links.
Cowbird is slowly adding new members, so if you're interested, click here to apply for an invitation.
Tutpup is a fun site for math and spelling practice. What engages students is the competitive aspect: once you join a game, you are matched against students from around the world, and that simple twist makes routine drills feel like a whole new experience.
What if every light bulb in the world could also transmit data? At TEDGlobal, Harald Haas demonstrates, for the first time, a device that could do exactly that. By flickering the light from a single LED, a change too quick for the human eye to detect, he can transmit far more data than a cellular tower -- and do it in a way that's more efficient, secure and widespread.
Do you know that we have 1.4 million cellular radio masts deployed worldwide? These are base stations. And we also have more than five billion of these devices here — cellular mobile phones. With these mobile phones, we transmit more than 600 terabytes of data every month. That's a 6 with 14 zeroes — a very large number. Wireless communications has become a utility like electricity and water. We use it every day — in our private lives, in our business lives. We even have to be asked sometimes, very kindly, to switch off our mobile phones at events like this, for good reasons. And it is this importance that made me decide to look into the issues this technology has, because it's so fundamental to our lives.
And one of the issues is capacity. We transmit wireless data using electromagnetic waves — in particular, radio waves. And radio waves are limited: they are scarce, they are expensive, and we only have a certain range of them. It is this limitation that cannot cope with the demand for wireless data transmission and the number of bytes of data transmitted every month — we are simply running out of spectrum. There's another problem: efficiency. These 1.4 million cellular radio masts, or base stations, consume a lot of energy. And mind you, most of that energy is not used to transmit the radio waves; it is used to cool the base stations. The efficiency of such a base station is only about five percent. That creates a big problem. Then there's another issue you're all aware of: you have to switch off your mobile phone during flights, and in hospitals there are safety issues. And security is a further issue. Radio waves penetrate walls. They can be intercepted, and somebody with bad intentions can make use of your network.
So these are the four main issues. But on the other hand, we have 14 billion of these: light bulbs. Light. And light is part of the electromagnetic spectrum. So let's look at this in the context of the entire electromagnetic spectrum. There are gamma rays — you don't want to get close to gamma rays; they can be dangerous. X-rays are useful when you go to the hospital. Then there's ultraviolet light — good for a nice suntan, but otherwise dangerous for the human body. Infrared — due to eye safety regulations, it can only be used at low power. And then there are radio waves, with the issues I've just mentioned. And in the middle, we have the visible light spectrum. It's light, and light has been around for many millions of years. In fact, it has created us, created life, created all the stuff of life. So it is inherently safe to use. And wouldn't it be great to use it for wireless communications?
Not only that — I compared the size of the radio wave spectrum with the size of the visible light spectrum. And guess what? We have 10,000 times more of that spectrum, there for us to use. So not only do we have this huge amount of spectrum; let's compare it with the number I've just mentioned. We have 1.4 million expensively deployed, inefficient cellular radio base stations. Multiply that by 10,000 and you end up at 14 billion. 14 billion is the number of light bulbs already installed. So we have the infrastructure there. Look at the ceiling — you see all these light bulbs. Go to the main floor — you see these light bulbs.
Can we use them for communications? Yes. What do we need to do? The one thing we need to do is replace these inefficient incandescent light bulbs and fluorescent lights with this new technology of LED light bulbs. An LED is a semiconductor, an electronic device, and it has a very useful property: its intensity can be modulated at very high speeds, and it can be switched off at very high speeds. This is the fundamental property that we exploit with our technology. So let me show how we do that. Let's go to the closest neighbor of the visible light spectrum — remote controls. You all know remote controls have an infrared LED: you switch the LED on for a one, and off for a zero. That creates a simple, low-speed data stream of 10,000 or 20,000 bits per second — not usable for a YouTube video.
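The remote-control scheme Haas describes is on-off keying: the LED is simply switched on for a 1 and off for a 0, and the receiver averages and thresholds what it sees. A rough sketch (not from the talk; all names are illustrative):

```python
# On-off keying (OOK), the scheme used by infrared remote controls:
# the LED is on for a 1 and off for a 0. The receiver averages each
# bit period and thresholds the result.

def ook_modulate(bits, samples_per_bit=4):
    """Turn a bit sequence into LED intensity samples (1 = on, 0 = off)."""
    signal = []
    for bit in bits:
        signal.extend([bit] * samples_per_bit)
    return signal

def ook_demodulate(signal, samples_per_bit=4, threshold=0.5):
    """Recover bits by averaging each bit period and thresholding."""
    bits = []
    for i in range(0, len(signal), samples_per_bit):
        chunk = signal[i:i + samples_per_bit]
        bits.append(1 if sum(chunk) / len(chunk) > threshold else 0)
    return bits

data = [1, 0, 1, 1, 0, 0, 1]
recovered = ook_demodulate(ook_modulate(data))
```

One bit per LED state at a time is exactly why this tops out at tens of kilobits per second — the motivation for the parallel scheme described next.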
What we have done is develop a technology that goes far beyond that remote-control approach to the light bulb. With our technology, we transmit not just a single data stream but thousands of data streams in parallel, at even higher speeds. The technology we have developed is called SIM OFDM, and it uses spatial modulation — those are the only technical terms; I'm not going into details — but this is how we enabled that light source to transmit data.
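Haas's SIM OFDM is a research scheme with extra subtleties, but the core idea of sending many streams in parallel over an LED can be sketched with the simpler DC-biased optical OFDM it builds on: data is spread across many subcarriers, the spectrum is made Hermitian-symmetric so the time-domain waveform is real, and a bias keeps the LED intensity positive. A minimal sketch under those assumptions (parameter names are mine, not from the talk):

```python
import numpy as np

def dco_ofdm_symbol(symbols, n_fft=64, dc_bias=2.0):
    """Map complex data symbols onto a Hermitian-symmetric OFDM frame so
    the IFFT is real-valued, then add a DC bias so the LED's intensity
    stays positive (the bias must exceed the waveform's peak swing)."""
    assert len(symbols) == n_fft // 2 - 1
    frame = np.zeros(n_fft, dtype=complex)
    frame[1:n_fft // 2] = symbols                    # data subcarriers
    frame[n_fft // 2 + 1:] = np.conj(symbols[::-1])  # mirrored conjugates
    return np.fft.ifft(frame).real + dc_bias         # real, positive waveform

def dco_ofdm_demod(intensity, n_fft=64):
    """FFT back to subcarriers; the constant bias only lands in the DC
    bin, which carries no data, so it needs no explicit removal."""
    spectrum = np.fft.fft(intensity)
    return spectrum[1:n_fft // 2]

# 31 parallel QPSK streams in one 64-point symbol.
rng = np.random.default_rng(0)
qpsk = (rng.choice([-1, 1], 31) + 1j * rng.choice([-1, 1], 31)) / np.sqrt(2)
recovered = dco_ofdm_demod(dco_ofdm_symbol(qpsk))
```

Each subcarrier is an independent low-rate stream; together they give the aggregate high rate the talk refers to.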
You will say, "Okay, this is nice — a slide created in 10 minutes." But not only that. We have also developed a demonstrator, and I'm showing this visible light demonstrator in public for the first time. What we have here is an ordinary desk lamp. We fit it with an LED light bulb, worth three U.S. dollars, and put in our signal processing technology. Then there is a little hole here, and the light goes through that hole to a receiver. The receiver converts the little, subtle changes in amplitude that we create into an electrical signal, and that electrical signal is then converted back into a high-speed data stream. In the future, we hope we can integrate this receiver into smartphones — and not only integrate a photodetector, but maybe use the camera inside.
So what happens when I switch on that light? As you would expect, it's a light, a desk lamp. Put your book beneath it and you can read. It's illuminating the space. But at the same time, you see this video coming up here. And that's a video, a high-definition video that is transmitted through that light beam. You're critical. You think, "Ha, ha, ha. This is a smart academic doing a little bit of tricks here." But let me do this.
(Applause)
Once again. Still don't believe it? It is this light that transmits the high-definition video stream. And if you look at the light, it is illuminating as you would expect. With your human eye, you don't notice the subtle changes in amplitude that we impress onto this light bulb. It serves the purpose of illumination, but at the same time, we are able to transmit data. And as you can see, even light from the ceiling comes down here onto the receiver. The receiver can ignore that constant light, because all it is interested in are the subtle changes. Now and then you might have a critical question: "Do I have to have the light on all the time for this to work?" The answer is yes. But you can dim the light down to a level where it appears to be off — and still transmit data. That's possible.
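The receiver's trick of ignoring constant ceiling light (and a heavy dim on the data lamp itself) comes down to high-pass filtering: a steady ambient level contributes nothing to the fast changes that carry the data. A toy illustration (my own, not from the demonstrator) using the simplest possible filter, subtracting the block mean:

```python
# A constant ambient light level shifts every sample by the same amount,
# so subtracting the mean leaves only the modulation that carries data.

def remove_ambient(samples):
    """Subtract the average level (ambient light + DC bias),
    keeping only the fast intensity changes the data rides on."""
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

modulation = [0.2, -0.1, 0.3, -0.4]              # the data-carrying wiggle
dim_lamp = [s + 5.0 for s in modulation]         # lamp dimmed low
bright_room = [s + 9.0 for s in modulation]      # extra ceiling light added
```

After `remove_ambient`, both received signals reduce to the same modulation — which is why the brightness level barely matters to the link.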
So I've mentioned to you the four challenges. Capacity: we have 10,000 times more spectrum, and 10,000 times more LEDs already installed in the infrastructure. You would agree with me, hopefully, that there's no capacity issue anymore. Efficiency: this is data through illumination — it's first of all an illumination device. If you do the energy budget, the data transmission comes for free — highly energy efficient. And that's without even mentioning the high energy efficiency of the LED light bulbs themselves: if the whole world deployed them, you would save hundreds of power plants. But that's an aside.
And then I've mentioned availability. You will agree with me that we have lights in hospitals — you need to see what you're doing. You have lights in aircraft. Light is everywhere. Look around. Look at your smartphone: it has a flashlight, an LED flashlight. All of these are potential sources for high-speed data transmission.
And then there's security. You would agree with me that light doesn't penetrate walls. So if I have a light here carrying secure data, no one on the other side of that wall can read it. And there's only data where there is light. So if I don't want a receiver to get the data, I can simply turn the light away: the data goes in that direction, and not there anymore. We can now, in fact, see where the data is going.
So for me, the applications of this are beyond imagination at the moment. We have had a century of very nice, smart application developers, and you only have to notice: wherever there is light, there is a potential way to transmit data. But I can give you a few examples. You may see the impact already. This is a remotely operated vehicle beneath the ocean. Such vehicles use light to illuminate the space down there, and that light can also be used to transmit wireless data, so these things can communicate with each other.
Intrinsically safe environments like this petrochemical plant: you can't use RF there — it may generate antenna sparks — but you can use light, and you see plenty of light there. In hospitals, for new medical instruments; in streets, for traffic control. Cars have LED-based headlights and LED-based taillights, so cars can communicate with each other and prevent accidents by exchanging information. Traffic lights can communicate with cars, and so on. And then there are the millions of street lamps deployed around the world, and every street lamp could be a free access point. We call it, in fact, Li-Fi: light fidelity. And then there are aircraft cabins. There are hundreds of lights in an aircraft cabin, and each of these lights could be a potential transmitter of wireless data — so you could enjoy your favorite TED video on your long flight back home. Online life. So I think that is a vision that is possible.
So, all we would need to do is to fit a small microchip to every potential illumination device. And this would then combine two basic functionalities: illumination and wireless data transmission. And it's this symbiosis that I personally believe could solve the four essential problems that face us in wireless communication these days. And in the future, you would not only have 14 billion light bulbs, you may have 14 billion Li-Fis deployed worldwide -- for a cleaner, a greener, and even a brighter future.
Using robotics, laser rangefinders, GPS and smart feedback tools, Dennis Hong is building a car for drivers who are blind. It's not a "self-driving" car, he's careful to note, but a car in which a non-sighted driver can determine speed, proximity and route -- and drive independently.
Many believe driving is an activity solely reserved for those who can see. A blind person driving a vehicle safely and independently was thought to be an impossible task, until now. Hello, my name is Dennis Hong, and we're bringing freedom and independence to the blind by building a vehicle for the visually impaired.
So before I talk about this car for the blind, let me briefly tell you about another project I worked on, called the DARPA Urban Challenge. This was about building a robotic car that can drive itself: you press start, nobody touches anything, and it reaches its destination fully autonomously. In 2007, our team won half a million dollars by placing third in this competition. Around that time, the National Federation of the Blind, or NFB, challenged the research community to develop a car that lets a blind person drive safely and independently. We decided to give it a try, because we thought, hey, how hard could it be? We already had an autonomous vehicle. We just put a blind person in it and we're done, right? (Laughter) We couldn't have been more wrong. What the NFB wanted was not a vehicle that can drive a blind person around, but a vehicle in which a blind person can make active decisions and drive. So we had to throw everything out the window and start from scratch.
So to test this crazy idea, we developed a small dune buggy prototype to test its feasibility. And in the summer of 2009, we invited dozens of blind youth from all over the country and gave them a chance to take it for a spin. It was an absolutely amazing experience. But the problem with this car was that it was designed to be driven only in a very controlled environment — a flat, closed-off parking lot, with even the lanes defined by red traffic cones.
So with this success, we decided to take the next big step: to develop a real car that can be driven on real roads. So how does it work? Well, it's a rather complex system, but let me try to explain it, maybe simplified. We have three steps: perception, computation, and non-visual interfaces. Now obviously the driver cannot see, so the system needs to perceive the environment and gather information for the driver. For that, we use an inertial measurement unit, which measures acceleration and angular acceleration — like the human inner ear. We fuse that information with a GPS unit to get an estimate of the car's location. We also use two cameras to detect the lanes of the road, and three laser rangefinders. The lasers scan the environment to detect obstacles — a car approaching from the front or the back — and any obstacles that run into the road or appear around the vehicle.
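The IMU-plus-GPS fusion Hong describes can be sketched as a complementary filter: the IMU's accelerations are integrated for fast, smooth position updates, while each slower (and noisier) GPS fix pulls the estimate back to keep it from drifting. This is a minimal one-dimensional sketch of that idea, not the team's actual filter; the class name, gain, and numbers are illustrative:

```python
# 1-D complementary-filter sketch of IMU + GPS fusion:
# dead-reckon from acceleration, correct drift with GPS fixes.

class PositionEstimator:
    def __init__(self, pos=0.0, vel=0.0, gps_gain=0.2):
        self.pos = pos
        self.vel = vel
        self.gps_gain = gps_gain  # how strongly a GPS fix corrects drift

    def imu_update(self, accel, dt):
        """Integrate IMU acceleration for a fast, smooth position update."""
        self.vel += accel * dt
        self.pos += self.vel * dt

    def gps_update(self, gps_pos):
        """Nudge the estimate toward an absolute (but noisy) GPS fix."""
        self.pos += self.gps_gain * (gps_pos - self.pos)

est = PositionEstimator()
for _ in range(100):              # 1 s of constant 1 m/s^2 acceleration
    est.imu_update(1.0, 0.01)
est.gps_update(0.5)               # a GPS fix reins in integration drift
```

A production system would use a Kalman filter over full 3-D state, but the division of labor — IMU for bandwidth, GPS for absolute reference — is the same.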
So all this vast amount of information is then fed into the computer, and the computer does two things. First, it processes the information to build an understanding of the environment — these are the lanes of the road, there are the obstacles — and conveys that information to the driver. The system is also smart enough to figure out the safest way to operate the car, so it can also generate instructions for operating the vehicle's controls. But the problem is this: how do we convey this information and these instructions to a person who cannot see, quickly and accurately enough that he can drive? For this, we developed many different types of non-visual user interface technology: a three-dimensional ping sound system, a vibration vest, a click wheel with voice commands, a leg strip, even a shoe that applies pressure to the foot. But today we're going to talk about three of these non-visual user interfaces.
Now, the first interface is called DriveGrip. It's a pair of gloves with vibrating elements on the knuckles, so we can convey instructions about how to steer — the direction and the intensity. Another device is called SpeedStrip. It's a chair — as a matter of fact, a massage chair. We gutted it and rearranged the vibrating elements in different patterns, and we actuate them to convey information about speed, along with instructions on how to use the gas and brake pedals. Over here, you can see how the computer understands the environment. And because you cannot see the vibration, we actually put red LEDs on the driver, so you can see what's happening. This is the sensory data, and that data is transferred to the devices through the computer.
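A device like DriveGrip reduces the computer's steering command to two numbers: which hand vibrates, and how hard. This hypothetical sketch (function name and scaling are my assumptions, not the team's) shows the shape of that mapping:

```python
# Hypothetical mapping from a steering command to DriveGrip-style
# vibration cues: the sign picks the hand, the magnitude sets intensity.

def drivegrip_cue(steer):
    """steer in [-1.0, 1.0]: negative = turn left, positive = turn right.
    Returns (left_intensity, right_intensity), each in [0.0, 1.0]."""
    steer = max(-1.0, min(1.0, steer))  # clamp out-of-range commands
    if steer < 0:
        return (-steer, 0.0)            # vibrate the left glove
    return (0.0, steer)                 # vibrate the right glove
```

The intensity channel matters as much as the direction: a gentle lane correction and an urgent swerve must feel different on the knuckles.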
So these two devices, DriveGrip and SpeedStrip, are very effective. But the problem is that they are instructional cue devices, and that is not really freedom, right? The computer tells you how to drive — turn left, turn right, speed up, stop. We call this the "backseat driver" problem. So we are moving away from instructional cue devices and focusing more on informational devices. A good example of an informational non-visual user interface is called AirPix. Think of it as a monitor for the blind: a small tablet with many holes through which compressed air comes out, so it can actually draw images. Even though you are blind, you can put your hand over it and see the lanes of the road and the obstacles. You can also change the frequency of the air coming out, and possibly the temperature, so it's actually a multi-dimensional user interface. Here you can see the left and right cameras from the vehicle, and how the computer interprets that and sends the information to the AirPix. For this, we're showing a simulator — a blind person driving using the AirPix. The simulator was also very useful for training blind drivers and for quickly testing different ideas for different types of non-visual user interfaces. So basically, that's how it works.
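The core computation behind an informational display like AirPix is downsampling: the computer's fine-grained picture of lanes and obstacles must be reduced to the coarse grid of air holes under the driver's hand. A sketch of that step, with grid sizes and map format assumed for illustration:

```python
# Downsample a fine obstacle map to a coarse grid of on/off air jets,
# the way a tactile display must reduce the computer's detailed view.

def to_airpix(obstacle_map, grid_rows=4, grid_cols=4):
    """obstacle_map: 2-D list of 0/1 cells (1 = obstacle).
    Returns the coarse jet grid; any cell with an obstacle turns on."""
    rows, cols = len(obstacle_map), len(obstacle_map[0])
    grid = [[0] * grid_cols for _ in range(grid_rows)]
    for r in range(rows):
        for c in range(cols):
            if obstacle_map[r][c]:
                grid[r * grid_rows // rows][c * grid_cols // cols] = 1
    return grid
```

The extra channels the talk mentions — air frequency and temperature — would extend each cell from on/off to a small tuple, which is what makes the interface multi-dimensional.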
So just a month ago, on January 29th, we unveiled this vehicle for the very first time to the public, at the world-famous Daytona International Speedway, during the Rolex 24 racing event. We also had some surprises. Let's take a look.
(Music)
(Video) Announcer: This is an historic day [unclear]. He's coming up to the grandstand, fellow Federationists.
(Cheering)
(Honking)
There's the grandstand now. And he's [unclear] following that van that's out in front of him. Well there comes the first box. Now let's see if Mark avoids it. He does. He passes it on the right. Third box is out. The fourth box is out. And he's perfectly making his way between the two. He's closing in on the van to make the moving pass. Well this is what it's all about, this kind of dynamic display of audacity and ingenuity. He's approaching the end of the run, makes his way between the barrels that are set up there.
(Honking)
(Applause)
Dennis Hong: I'm so happy for you. Mark's going to give me a ride back to the hotel.
Mark Riccobono: Yes.
(Applause)
DH: So since we started this project, we've been getting hundreds of letters, emails, and phone calls from people all around the world. Letters thanking us, but sometimes you also get funny letters like this one: "Now I understand why there is Braille on a drive-up ATM machine." (Laughter) But sometimes — (Laughter) But sometimes I also get — I wouldn't call it hate mail — letters of really strong concern: "Dr. Hong, are you insane, trying to put blind people on the road? You must be out of your mind." But this vehicle is a prototype, and it's not going to be on the road until it's proven as safe as, or safer than, today's vehicles. And I truly believe that can happen.
But still, will society accept such a radical idea? How are we going to handle insurance? How are we going to issue driver's licenses? There are many hurdles of this kind, besides the technology challenges, that we need to address before this becomes a reality. Of course, the main goal of this project is to develop a car for the blind. But potentially more important is the tremendous value of the spin-off technology that can come from it. The sensors we use can see through darkness, fog, and rain, and together with these new types of interfaces, we can apply these technologies to safer cars for sighted people — or, for the blind, to everyday home appliances, in educational settings, in office settings. Just imagine: in a classroom, a teacher writes on the blackboard and a blind student can see what's written and read it using these non-visual interfaces. That is priceless. So what I've shown you today is just the beginning.