User Experience Design of Ubiquitous Computing Devices

Guest lecture on Apr 22, 2008

Readings

Benjamin Lau - Apr 18, 2008 11:20:26 pm

Weiser's article on ubiquitous computing is a bit dated (1991), but some of what it argued for eventually happened. We don't exactly have hundreds of computers per room like the article suggested, but we have seen a lot of expansion in recent years. A lot of modern cars contain a simple specialized processor. Some professors at Berkeley use interactive whiteboards. And there's talk of adding CPUs to things as mundane as toasters and other appliances. The invention of IPv6 was partially motivated by the desire to create enough address space for embedded devices to have their own unique IPs. And so on. But in general I don't think ubiquitous computing has reached the level the author was hoping for. There are several reasons for this; one, I think, is that it's a bit of a bother to manage all these systems. My desktop already takes a fair amount of time to maintain (antivirus, occasional defragmentation, updating various applications, etc.); I can't imagine what else I might have to do in a ubiquitous computing world. Furthermore, with centralized computing in the current paradigm, all of the functionality can be easily accessed through the same interface and at the same time. It's convenient and simple, and there has to be tremendous incentive (e.g. mobile communication via cell phone) to add another system.
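
To put the IPv6 point in perspective, here is some back-of-the-envelope arithmetic (a sketch only; the Earth-surface figure is an assumed round number, not from the reading):

    # Rough arithmetic comparing IPv4 and IPv6 address space.
    ipv4 = 2 ** 32                      # ~4.3 billion addresses
    ipv6 = 2 ** 128                     # ~3.4e38 addresses
    earth_surface_m2 = 5.1e14           # assumed: ~510 million km^2, land and sea

    print(f"IPv4: {ipv4:.2e}  IPv6: {ipv6:.2e}")
    print(f"IPv6 addresses per square metre of Earth: {ipv6 / earth_surface_m2:.2e}")

That works out to roughly 6.7e23 addresses per square metre, which is why embedded devices each having a unique IP stops being a constraint under IPv6.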

Eric Cheung - Apr 19, 2008 02:21:37 pm

I was rather surprised after reading the two articles that ubiquitous computing is actually a much broader area than I had originally thought. I'd tend to agree with the second reading's suggestion that ubiquitous computing in one shape or another is here; it's just only being used in certain places. I don't see a huge demand at the moment for the types of ideas proposed by Weiser over more traditional computing in terms of costs and efficiency. I think another stumbling block for the acceptance of ubiquitous computing, mentioned only briefly in the Weiser paper, is the issue of privacy. A lot of people would be uncomfortable with their personal data being spread around to the extent that it is in Sal's world. (As a side note, Weiser's description of Sal's world reminded me a little of a product called "LifeWall" that Panasonic announced at CES (http://www.panasonic.com/cesshow/). Granted, it seems a lot more limited than Weiser's vision, but it'd be useful to see what kind of demand there is for that type of product.)

Henry Su - Apr 20, 2008 12:45:46 pm

It is interesting to compare and contrast the world envisioned by "The Computer for the 21st Century" with the world we live in today, since that article tries to predict what life will be like two decades after 1991. The 1991 article describes tabs, pads, and boards that seem to be not only ubiquitous, but also public to use and effortlessly integrated into business. Although we have similar items today (like smart-phones and laptops), used for somewhat similar purposes, they are often owned privately, and they use varying technologies that work together thanks to hard work by the designers, as well as agreed-upon standards. Of course, there are some "publicly available" devices, such as library kiosks and automatic toll collectors, but they are often specialized in some way--not general-purpose public computers. The world presented by the 1991 paper also makes it seem like people are getting tracked everywhere by everyone--today, we know there are limits to this due to privacy concerns. I agree with the "Yesterday's Tomorrows" article that ubiquitous computing is indeed here already, just not quite like the world described by the 1991 article. The more recent article also mentions that cultural and political differences can make the term "ubiquitous computing" mean very different things in different places. Indeed, ubiquitous computing need not be like the utopia described in the 1991 article.

Scott Crawford - Apr 20, 2008 07:45:12 pm

"The Computer for the 21st Century" reading reminds me of the readings early that talked about the relative pros/cons of having different devices for different tasks or having one device for many tasks. The difference between them is that this article appears to assume that distributed computing is best because swiss-army-knife computing fails to seamlessly embed itself into everyday life. It compares the analogy of computing to the analogy of written language, but doesn't treat the expansion of language upon the introduction of new technologies. In other words, written language is a 'special technology' - as a concept, though the medium is subject to novelties - because it is more closely tied to human intelligence than other forms of technology. Saying that all technologies should aspire to be so integrated into human understanding to become a form of language unto themselves is presumptuous, because the simple solution is to simply modify language to describe all things that the new technology accomplishes. I'm not arguing that it might be a good idea (because I agree that it is), but appealing to the success of written language as an integrated technology isn't the best way to argue for life-integrated computing.

Jonathan Chow - Apr 20, 2008 10:12:51 pm

Weiser's article was quite interesting. While reading his example of Sal's world, I couldn't help but ask myself why we don't have some of the technology he mentioned. Items such as the coffee maker that understands only two commands make sense. After all, the simplest way to get ubiquitous computing is to make a bunch of smaller, simpler products to take the place of a big PC. But I think we are too enamored with the idea of having one product that can do everything. Maybe that's because we all started with a PC, so it anchored us into thinking of all the different things we could do with a personal computer instead of developing new electronics. I also found Weiser's mention of security interesting, as that was one of the top concerns that came to mind when thinking about all these integrated electronics.

The Bell and Dourish article makes a very good point in saying that Weiser's vision of tomorrow is basically outdated and our current vision of tomorrow is different. It seems to me that in many ways we have become integrated with computers instead of the other way around. Where Weiser mentions using computers to replace very similar physical items, like notepads, windows, etc., it seems like we were not able to just ditch the PC and have instead become very dependent on it and all the features it offers (internet, email, social networking, etc.).

Michelle Au - Apr 20, 2008 11:05:34 pm

Many of the technologies described in Weiser's paper reminded me of technologies in use today. While our world is definitely not the same as Sal's world, certain events in Sal's world sparked a sense of familiarity. Linking coffee makers to alarm clocks has been an idea circulating for many years now, and a few prototypes have been created. Virtual offices are also partially here. Microsoft's NetMeeting is a VoIP and videoconferencing application that has some interesting whiteboarding features similar to those described in Sal's world. Clients can share applications and desktops, allowing another person to see and edit your desktop (and applications). While these features are confined to a main computer and not embedded into everyday objects as in Sal's world, the virtual interaction between two users is similar to Sal and Joe's interaction, and is a good stepping stone toward Weiser's full vision.

Nir Ackner - Apr 21, 2008 02:46:29 pm

I have to agree with Henry's argument that while we do see many computers as publicly available devices, they are mostly specialized rather than the general-purpose displays Weiser describes. This makes sense, due to the exact roadblock Weiser predicts -- we have the hardware, we just don't have the networking and configuration software needed to make it work. Consider using an external projector for a presentation. Still today, at least half the time the presenter is unable to get the projector set up properly at all, let alone in a reasonable amount of time. That said, standards like USB have greatly improved the ability of devices to appear and disappear at any time from computer systems. Hopefully, this progress will continue and solve the current limitations on Weiser's vision.

William Tseng - Apr 21, 2008 05:28:52 pm

I'm not sure Weiser's vision of ubiquitous computing is exactly what I want. Even though he claims ubiquitous computing would result in "helping with information overload," I think the vision he painted was actually a worse situation. I think the current environment, where there is an additional step to accessing the information (i.e. taking out your cell phone and then looking up the weather, as opposed to seeing the weather information displayed directly on a window), is better. We want information to be easily and readily retrieved, but we don't want it presented to us ALL OF THE TIME.

The Bell and Dourish article gives a perfect example of this in its description of Singapore and the ability to access "feng shui" advice, call a cab, check the weather, etc., all from cell phones. I also think what people want from ubiquitous computing differs across cultures and environments, and there really shouldn't be just one vision of what ubiquitous computing should be. This is what Bell and Dourish explain when they talk about "e-mosques" or the acceptance of PC-bangs in South Korea.

Gary Miguel - Apr 21, 2008 05:51:44 pm

That's pretty cool that we were told which parts of a reading are most relevant. I wish this were the case for all readings.

The Weiser article seemed very weird to me. As I was reading it I thought, "What's the point in speculating about what you and your fellow researchers want the future to be like?" I don't know this for sure, but I would guess that historically, drastic changes in societal use of technology have followed technological breakthroughs and have happened gradually. To take Weiser's example of electricity running through the walls of every home: that came to pass gradually and in an ad-hoc fashion. There was no mastermind who wrote papers about what life would be like once everybody had electricity. And even if there had been, it would have happened without him.

I think the Bell and Dourish article addresses some of the futility of trying to impose a particular vision on the future.

Khoa Phung - Apr 21, 2008 07:03:23 pm

It was nice to read the original article from the past and compare it with a present-day interpretation of it. I agree that Weiser's vision will differ from society to society, but it laid a framework for future developments as well as a new field of research. I believe that ubicomp is here already, yet the integration is very subtle and therefore hard to notice. Cell phones, for example, now include all sorts of sensors, such as accelerometers, GPS, voice recognition, and biometric recognition. Since cell phone ownership has increased over the last few decades, I believe the cell phone will be our integrated computer, passing information on to our environment so it can adapt to our preferences. Bluetooth technology enables this kind of data transmission. Some examples I can think of are cell phones that unlock and lock your workstation on approach (see the sketch below). The Toyota Prius also opens the door automatically when the key holder approaches. These changes are so subtle that people don't notice them. I can only see more of this in the future as our wireless infrastructure becomes more widespread (e.g. the 700 MHz spectrum auction).
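
A minimal sketch of the proximity-unlock idea, assuming a hypothetical setup: the RSSI threshold is an assumed cutoff, the readings are simulated, and a real system would query the phone through a platform Bluetooth API instead.

    RSSI_THRESHOLD = -70  # dBm; readings weaker than this count as "out of range" (assumed cutoff)

    def proximity_guard(rssi_readings, lock, unlock):
        """Lock the workstation when the paired phone's signal fades, unlock when it returns."""
        locked = False
        for rssi in rssi_readings:
            in_range = rssi is not None and rssi >= RSSI_THRESHOLD
            if locked and in_range:
                unlock()
                locked = False
            elif not locked and not in_range:
                lock()
                locked = True

    # Simulated readings: the owner walks away (signal fades, then drops), then returns.
    readings = [-50, -60, -75, -90, None, -80, -65, -55]
    proximity_guard(readings,
                    lock=lambda: print("locking workstation"),
                    unlock=lambda: print("unlocking workstation"))

The point of the sketch is that the "subtle" behavior is just a simple state machine over signal strength; the hard part in practice is the pairing and configuration, not the logic.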

Hannah Hu - Apr 21, 2008 06:34:23 pm

"who would ever use a desk whose surface area is only 9" by 11"..." - oh, the 90s. What fun...

The idea of ubiquitous computing - and excuse me for being a pessimist - will never come to fruition. Weiser compares writing to silicon technology in exemplifying his vision of invisible technology. But the instruments involved in the "first information technology" - pens, pencils, markers, etc. - come in only a very limited variety of shapes, sizes, and materials. They do just one thing - lay graphite or ink on paper or any writable surface. A very simple task, a very straightforward process. It is no wonder that writing has been integrated into our lives - by making life easier.

Contrary to that, computing devices strive to do so much, in light of demand for all-in-one functionality. The iPhone makes calls, surfs the web, plays music, and checks the weather. The laptop plays movies, downloads software, stores e-books, connects online, opens up word processors, draws digital images via a mouse or tablet, etc. etc... Given this functionality, it will never be fully integrated into our lives. We don't expect our pens and pencils to do anything more than make marks. Similarly, I don't see why we should make computing devices do more than they should. (Phones are phones...)

But perhaps we shouldn't try to integrate computing devices into our lives. Currently, they are very self-contained. You can use a pen on any surface - paper, the wall, the floors, your roommate's face - and even put it to unintended uses (I sometimes use pens to pop keys off my keyboard for cleaning). In contrast, a laptop or PDA does everything within itself and does not interact with the outside world. This kind of self-contained functionality alienates the device from humanness; we are social creatures, and things that are not so social seem less than human. A pencil can socialize with a surface; a computing device currently cannot.

Is it worth having invisible technology? We already have it with writing and paper. But is it worth "evolving" that into typing and screens? There's less space here than I need, but in short, I think it's best to keep the wheel un-reinvented.

Ilya Landa - Apr 21, 2008 09:04:26 pm

An interesting paper. It is always fun to read about the future from the perspective of people living in the past. Some predictions seem humorous nowadays - it's nearly 2015, but my flying car and hoverboard are nowhere to be seen. But sometimes they are dead on. The video shown during Thursday's lecture amazed me much more than the publicly available pads described in this article.

In the video, designers and engineers from 40 and 50 years ago made amazing breakthroughs that define the modern world. The mouse, for example, has stayed essentially unchanged through the decades. The light pen application was equally impressive. I won't go so far as to say that Paint does not have more functionality than that machine, but most of the basics are there - drawing in real time right on the screen, moving objects, resizing, copying, etc.

It's funny that even though technological progress moves at an exponential rate, people who try to predict the future are often wrong, yet people who design computers for their own time often end up defining that very future.

Chris Myers - Apr 21, 2008 09:21:40 pm

Weiser defines a ubiquitous computing device as one that weaves itself into the fabric of society. It is tailored to a specific task, is invisible, and is aware of its surroundings. I would argue that the modern cell phone is nearly this. They are everywhere - billions of cell phones all over the world - and the focus is less on the device itself and more on the network. They contain general-purpose computing hardware, but are specialized devices. However, they are not as ubiquitous as the devices in Minority Report, which were nearly invisible (transparent, too), part of everyday life (cereal boxes, etc.), performed their function seamlessly, and transferred to other devices without error. That is a perfect example of invisible computing, where the actual focus on 'computation' has disappeared and the focus is on the exchange of a video clip or sound bite. The data isn't managed with windowing systems but with the storage devices themselves.

Maxwell Pretzlav - Apr 21, 2008 08:51:22 pm

The message here is clearly in between these two articles. The second article addressed a lot of what the first article made me think: Don't we already have all this? Aren't mobile devices and web apps bringing ubicomp to life already? I can imagine a scenario much like Weiser's futuristic vision playing out today, using Skype and Google Docs to achieve almost the same level of collaboration and communication, just based around the super-powerful desktop rather than tiny devices. Cell phones, smartphones, laptops, and desktops are our tabs, pads, and boards; it's just that the 13" and 20" displays have turned out too practical to be quickly superseded by tiny and giant ones, maybe because they more closely match the particular needs of digital interaction.

I fully agree with the William Gibson quote in the second article, "The future is already here; it's just not very evenly distributed", and enjoyed seeing a scholarly paper approach such an idea seriously. While I can't make strong judgments without knowing the middle section of the second paper, I got the impression that economics and social class were mostly ignored even in the recent paper—it seems to me Gibson's quote refers not only to governments and cultural views, but also to financial and class differences in a global landscape. Cell phones may be moving us towards a more "ubicomp" future, but the power of those devices and the networks that connect them continue to be radically different in the First and Third Worlds.

Roseanne Wincek - Apr 21, 2008 08:15:07 pm

I thought that "The Computer for the 21st Century" article was a little dated. It's always funny to look back at people trying to predict the future. I thought it was a little bit like the Jetsons, but in some ways we are living in a ubicomp environment. Something I found funny was that Sal still reads the paper form of the paper: "She still prefers the paper form, as do most people." That is totally off the mark, with most newspapers hemorrhaging due to online competition. I do think that the virtual workspace was similar to things we have today, with screen sharing and online documents. I know that I check the traffic on my phone when I get in the car, and plenty of people have fancier navigation systems with real-time traffic info. I also thought that looking up a forgotten colleague is a lot like looking someone up on Facebook. I think that web applications are really aiding this push into ubiquitous computing. Now that everyone's information is stored on servers (in email, online docs, or social networks), people can access their personal information anywhere they have a computer (or even just a smartphone). There might not be computers in your coffee maker that know "yes" and "no" yet, but we still interact with far more computers for far more tasks today than in 1991.

Diane Ko - Apr 22, 2008 12:11:06 am

The Weiser article brought up some interesting points from the time period of 1991. The section about computers being extended onto larger spaces reminded me of the actual desktop computer in the movie The Island (which apparently is a real concept now). In 17 years so much has changed in terms of technology. Even still, the point about people not even really skimming the surface of the capabilities of computing is still valid. The average user of personal computers still has very little knowledge of the inner workings and capabilities of their computer.

Cole Lodge - Apr 22, 2008 12:12:17 am

Contrary to what Weiser believed in his article, it seems that we are moving further away from ubiquitous computing. The number of devices I personally carry around has decreased, not increased as Weiser believed. A few years ago, I would consistently use several devices: a desktop computer, laptop, television, cell phone, mp3 player, etc. Yet now I only use a laptop (including for watching television) and my cell phone. In most areas, we as a society seem to be heading towards larger, more powerful all-in-one devices, not several single-tasked ubiquitous machines. Although the majority of devices are trending away from Weiser's design, I can see applications where ubiquitous computing would be helpful, such as picture frames and house lighting.

Gordon Mei - Apr 22, 2008 12:27:17 am

The notion of a computing device that disappears into the background is an ideal one, because today computers are so technical, so separate in their own world from all the other items in our lives. With books or paper, you just shuffle them around, toss them around - they're useful tools that vanish into the background because, unlike computers, they aren't viewed as expensive equipment that people are afraid to scratch, dent, wet, or otherwise abuse in ways that we find natural and more tangible as humans. You generally don't see avid enthusiast and support communities forming around specific types of paper, or flamewars over which pencil is better than another. The article points out that while portable devices must be carried from place to place, pads are intended to be throw-away computers, as is the case with paper.

With paper-based information like a book, you get straight to the first chapter and start reading the content immediately. There's no need to worry about technical troubleshooting details like compatibility, or regular system maintenance of the book. Furthermore, a device standing out as it does now is like an enthusiast endlessly reciting a special aspect of an experience, such as the deepest technical audio details, to an end user beyond what he or she cares to hear. It's important, but it assumes too high-profile a role.

Weiser thinks we should modularize and compartmentalize, though I believe the larger issue with silicon-based information media is that it currently requires so much focused attention on the device itself. The article notes that information on things like signs or candy wrappers does not require active attention; the info can be gleaned at a glance. The modularization itself is already underway with the growing usage of powerful mobile devices today, and it's a step towards making these devices more integrated and unobtrusive, so that general users won't see them as highly technical entities they fear to understand.

Robert Glickman - Apr 22, 2008 01:19:10 am

The Weiser article was interesting in that it described a future that is just around the corner. I was shocked to find that the article was rather old -- from 1991. At times I felt as if I were reading an article written maybe last year, but in a somewhat naive way -- as if it didn't have the experience of very modern technology to back it up. The description of the technologies one might encounter in daily life is a futuristic notion from the past that hits very close to home in the present and very near future. Certainly, it will be interesting to see how accurate this story proves in the near future. However, while ubiquitous computing seems somewhat interesting as a casual food-for-thought area, it does not really strike me as so useful. The second article was rather dull, as it only seemed to describe what ubiquitous computing is and what can be drawn from it. It still left me wondering, "Why am I reading this?"

Brian Trong Tran - Apr 22, 2008 01:43:27 am

I think the discussion on making computers seem invisible was very interesting. In this class, we've continuously talked about how a change in user interface can make things easier to use. However, the article states that actually changing the way systems are embedded in our daily life would be very effective in making them easy to use by not requiring intense concentration. Computer development has always centered around solving problems, but making the user think the problem does not exist is entirely different. Current technology has already led to a trend in which computers have become more a part of infrastructure. It won't be long until computers everywhere are the norm.

Brian Taylor - Apr 22, 2008 01:22:56 am

Wow, the thought of ubiquitous computing seems awesome. I can't wait for tomorrow, when I'm crumpling up my computer and chucking it into my handy desk trash can, then pulling another one out of my drawer to start all over on. I thought the whole futuristic scenario with Pam or Sam or whatever her name was was pretty interesting. I totally agree with Weiser that ubiquitous computing is just the natural direction computing is headed. Like any other new technology, it will simply be slowly merged into daily life. As it becomes more commonplace and fades into the background of tools we all use, it will be less of some glorious, magnificent, awesome computing thing, and just what it should be: my handy tool for accomplishing this task and that task throughout the day. Granted, it will be a powerful tool that can connect people across great distances and other complications, but computers as we know them today should fade away into the many tools used for the many activities we will use them for.

Edward Chen - Apr 22, 2008 12:55:45 am

It was interesting to read the Weiser article and see what technologies he predicted would exist in the future to support his ubiquitous computing. His overall vision definitely sounds like something to strive for. I've always imagined people becoming almost partially cyborg in the future as our daily lives become more and more augmented by technology. Already, there are gadget geeks everywhere who like to get the latest portable devices that make their lives easier.

The second article was actually a rather interesting read, mainly because of how it described the varying levels of ubiquitous computing these days. From how the authors described the infrastructure of Singapore and Korea, it does seem that those places have fulfilled more of the ubiquitous computing vision than other cities. It was interesting to see how ingrained mobile technology and the Internet are in people's lives, from the number of people with cell phones and Internet connections to how the government issues warnings via cell phone. People living in those cities have become so used to that lifestyle that it seems completely normal to them, but to an outsider it looks like another level of technology infrastructure compared to what we're used to in the US.

Jeffrey Wang - Apr 22, 2008 11:17:52 am

"The Computer for the 21st Century" is a classic that describes Mark Weiser's vision of ubiquitous computer devices. A lot of people are saying this paper is "dated," but I think his ideas are very relevant to even today. While computing is becoming more ubiquitous, the potential is so much more in the future. The obvious example is computing on the mobile device. As we learned in this class, there are numerous limitations that we face with mobile computing, few might probably vanish in a couple of years. While a lot of people are trying to force old desktop and internet ideas onto the mobile device, I believe there will soon be a new category of applications that will enhance ubiquitous computing.

One concept in the paper that was a bit confusing to me was the notion of a "pad." It seems like a scrap computer that can be picked up anywhere you go. This is very futuristic, even by today's standards. However, it would definitely be useful if we can reach that stage.

Jonathan Wu Liu - Apr 22, 2008 11:35:59 am

I definitely agree that the world is walking down the path towards ubiquitous computing, and as things get smaller, ubiquitous computing is closer to being realized. Size and portability are necessities for this change to happen. Most of all, I think capitalism is the main driving force of this trend and needs to remain in place for it to occur. However, it is obvious that the idea of ubicomp is still a little ways away. As Weiser points out, ubicomp should reduce the problem of information overload. We overcame the problem of not having enough information through technologies such as search and aggregators, and right now we are still working out how to intelligently reduce information overload. We are part of the advancement towards ubicomp, and the future looks exciting.

Glen Wong - Apr 22, 2008 11:15:03 am

I found both articles to be quite interesting. It was also nice that the second article was a sort of response to the first one fifteen years later because it gave a more complete picture and gave closure to the first article. Weiser's predictions in the first article are quite interesting. While some of his ideas seem a bit overkill to me, the fact that this article was written almost two decades ago is pretty impressive. I thought his sketch of "Sal's world" was quite interesting and it makes me think about how things would have been if his visions had actually been realized. In the second article it was interesting to read about how Weiser's ideas are still highly influential in the ubicomp field today. Though the most interesting part of the second article was the discussion of how ubiquitous computing is here today but we've failed to realize it because it has shown up in a different form than Weiser imagined.

Alex Choy - Apr 22, 2008 12:48:40 pm

Both articles were interesting. The second article is like a follow-up to the first one. Many of the things that Weiser mentioned have either been done or are being explored, such as distributed computing and wireless communication. The example of "Sal's world" is a vision of the future. It is popular to envision the future, as the second article mentions, citing examples such as World's Fairs and Disney's Tomorrowland. In the examples of Singapore and Korea, those countries have integrated technology and communications into their societies. The article cites statistics on the number of people who use the internet regularly and own a computer. As such, ubiquitous computing is more noticeable in Korea and Singapore, especially because it is different from the United States. In both Korea and Singapore, the cell phone is used for most everything. In Korea, the phone is used to pay for online content, and in Singapore, the phone is used to send out government warnings. As Weiser mentioned, ubiquitous computers may tend to bring communities closer together. The constant "connectivity" of Singapore and Korea is an example of this.

Hsiu-Fan Wang - Apr 22, 2008 12:27:58 pm

I remember reading about various people who have wired themselves up to record every waking moment of their lives. I actually find this idea very intriguing (and if they have a newsletter, I wish to subscribe to it). I think the day of "ubicomp" has come (like everyone else on this page, it seems); notably, most average people interact with computers constantly. While the number of obvious computer interactions in a day is higher than Weiser would probably like, people do interact with computers throughout the day. Analogous to his motor example, where motive force became commonplace and embedded everywhere, I think computers are at the point where they have been embedded everywhere and now provide "computational force". My readings for my software engineering class mention offhand that electric razors are estimated to contain approximately 4KB of software, with similar statistics for a few other objects. With the writing example, then, I think the argument that writing is so ingrained is only true if professional writers are ignored, and the same holds for the computer interactions of everyday people. Most people are just fine using an ATM, and are only confronted with the computer inside when they get a Windows error message.

Jessica Fitzgerald - Apr 22, 2008 12:09:45 pm

I feel that the Bell and Dourish article is correct when it says that the future is already here. Computers have been integrated so much into our daily lives that they have become a necessity. But I feel that they have not been integrated in the way Weiser described.

The Weiser article is mostly outdated, and a lot of the things he suggests will happen in the future have already happened. The sense I got from the article was that the way he described ubiquitous computing was almost reminiscent of science fiction. He imagines computers being anywhere and everywhere you can think of. The way he describes it, it seems as though computers aren't being integrated into daily life - they have taken over it. Being able to retrace what you were thinking at a certain time seems like science fiction, as in his example of thinking about a dress you saw and liked, and having a computer retrace your steps and find exactly what you were looking at. At that point it seems as though computers really have taken over. However, the storage space ideas he talked about are quite evident in the world around us today. A terabyte of space is not unheard of, and is more common than you might think.

Yang Wang - Apr 22, 2008 12:48:13 pm

While the first paper seems a bit dated, it is an interesting piece, although I don't fully understand why we are required to read it. It is obvious that the predictions he made in the paper didn't all come true. There are several possible reasons for this. First, people do not adopt new technology as fast as he thought. Second, physical limitations kept computers from expanding that fast; for example, batteries did not evolve to be as effective as he expected. Third, some resources that seemed unlimited then now seem both common and limited. A terabyte of space seemed limitless, such that "deleting files would not be necessary"; however, today even a terabyte will soon be eaten up by huge high-definition video files or large video games. The second paper argued that ubiquitous computing is already here. I'd agree with that to some degree, but completely adapting to such a system takes time, both for people to learn it and for some physical limitations to be removed.

Bo Niu - Apr 22, 2008 12:47:40 pm

Instead of explaining ubiquitous computing, the articles read more like a science fiction novel. I feel this way not because there are many predictions about the future in the articles, but because of the way they present the ideas. The articles focus too much on listing all the exciting new technologies that may be invented later, rather than explaining more about the concept of ubiquitous computing. So it was just another predict-the-future kind of novel to me.

Megan Marquardt - Apr 22, 2008 01:02:17 pm

When reading the first paper, I kept arguing with myself about whether the idea of ubiquitous computing is feasible. I came to the conclusion that I still don't know, but I'm leaning more towards the opinion of the article, given that some of its concerns are addressed by current cell phone technology. There is no more concern about computers not knowing their position and location, thanks to GPS technology. I also had an initial concern about how he compares ubiquitous computing to the way writing and motors have become seamlessly interwoven, because computers are so global due to assumed internet connections, as opposed to holding information specific only to the device. I think that ubiquitous computing is very feasible, but in a different manner than he described: localized computers will be everywhere, aiding in small functions around one's house or office building or school, etc., but people will still want that one device that controls all of their personal life and syncs with the other, smaller devices embedded in the everyday.

Katy Tsai - Apr 22, 2008 01:20:57 pm

While I think it would be ideal to have the technology we use each day integrated into our daily lives, I think the idea of ubiquitous computing calls into question our privacy and the information that is made available to the world. At this point in time, I would consider technology far from invisible. Many of our daily interactions are centered around computing devices, laptops, and more. And while I can see how written information has become ubiquitous, it is still difficult to see how laptops and cell phones can become that way. This idea is often questioned in futuristic movies, where people suddenly begin to question a world so integrated with technology that no one even notices they are talking to screens instead of real people. When I think of this, I start to wonder if this really is what we want. Maybe there is something valuable in making technology's presence known. It reminds us that it is just a box of information and nothing more.

Jeff Bowman - Apr 22, 2008 01:44:53 pm

When reading the piece on Xerox's PARC research facility, I couldn't help but oscillate between an image of George Jetson and an image of Tom Cruise in Minority Report. The ubiquitous computing platforms that he imagined would be common "by the end of the decade" (read: 2000) were clearly only halfway there, though his accuracy on some of the descriptions was impressive. The ubiquity of RFID (TransLink, and Cal ID cards/readers), portable communications with location tracking (GPS and phones), and even communications input (Livescribe, in a shameless plug) seems to indicate that ten years later we're approaching ubiquitous computing in a very interesting stage.

The fact that many of the technologies he mentions have been integrated into the Internet, and that many of his thin and ever-present gadgets are available in the proof-of-concept stage (e.g. the Chumby, the Nokia N800, and the intelligent refrigerator), speaks to the fact that the technology is there, but that we haven't yet crossed the user interface hurdle he perceives in the article.

Oh well, that's what lecture is for, right?

Lita Cho - Apr 22, 2008 01:40:03 pm

I feel like the comparison between the two readings was very interesting. Reading the predictions from "The Computer for the 21st Century", I am very surprised how many of them have come true, such as electronic chalk. The Wacom tablet is very similar to the description of electronic chalk, except that the pixels are only black and white. Even though I would love for ubiquitous computing to become a reality in my lifetime, I find that very hard to believe. Writing took a long time before the technology became indistinguishable. We are still using input devices designed in the 1700s, and they are still very distinguishable. The keyboard is just the same thing as a typewriter, except you are typing into a screen rather than onto a piece of paper. That technology is very present and is not part of the fabric of everyday life all over the world. In order for ubiquitous computing to come true for the personal computer, I think we have to change the input devices dramatically. I suggest speech recognition is the way to go. The computer should be able to know what you want based upon your everyday language in order to disappear from your everyday life.

Johnny Tran - Apr 22, 2008 01:36:36 pm

While Mark Weiser addressed the problem of privacy and security in ubiquitous computing, Jim Morris's solution in the same article (making sure that intrusions leave fingerprints, rather than actively preventing break-ins) seems incomplete, and I hope it does not become commonplace. As anyone who has frequented Internet forums or communities has seen, real-world ethics don't always translate directly into the computing world, and relying on the assumption that they will remain in effect is dangerous. I think usability researchers tend to ignore security because, a great majority of the time, security measures only serve to impede usability.

It is a difficult problem: too-weak security measures will make ubiquitous computing a totalitarian nightmare, whereas too-obnoxious security will make ubiquitous computing unusable. Extensive research and innovation will have to go into solving this problem.

I do not agree that the computing world will simply disappear into the real world, as Weiser argues. Instead, the two worlds will most likely merge and meld together, with the result being something completely unlike what we see today. It is not just the computers that are changing to fade into the background; we humans are also changing--evolving, perhaps--to adapt to the new technologies we have created and to integrate them into our lives. Computing will permeate all parts of society not because it will be completely invisible, but because society will reach the point that it has made computing an integral part of itself.

Raymond Planthold - Apr 22, 2008 01:38:31 pm

1. It's clear that Weiser put a lot of thought into the possible implementation of his ideas, but I did find it amusing that several of his concepts, "pads" in particular, had a distinct "Star Trek" feel to them.

2. Bell and Dourish are right on the money when they suggest that standardization might be the biggest hurdle. Small bits of progress have been made in the area of data portability, especially with respect to the Web (e.g. RSS; see the small example below), but it'll take a lot more work before total ubiquity is possible. We can't even reliably exchange documents today without running into headaches, either between different versions of Word, or between Word and other programs.
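
A tiny illustration of why RSS counts as data-portability progress: the format is standard XML, so a few lines of stock Python can read any conforming feed, regardless of which program produced it. The feed content here is invented for illustration.

    import xml.etree.ElementTree as ET

    # A minimal RSS 2.0 document (made-up example content).
    feed = """<?xml version="1.0"?>
    <rss version="2.0">
      <channel>
        <title>CS160 Announcements</title>
        <item><title>Guest lecture on ubicomp, Apr 22</title></item>
        <item><title>Project brainstorm due Friday</title></item>
      </channel>
    </rss>"""

    root = ET.fromstring(feed)
    for item in root.iter("item"):   # any reader can walk the same standard structure
        print(item.findtext("title"))

The contrast with Word is the point: the consumer of an RSS feed needs to know only the shared schema, not the producer's software.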

Yunfei Zong - Apr 22, 2008 01:59:57 pm

The Weiser article talked about how ubiquitous computing is possible, and gave many examples like the tab or the alarm-clock coffee maker. However, I'm disappointed that this is as far as it goes into the realm of plausibility; justifying that these devices can be built doesn't really say anything about the possibility of them coming to market. Even though they might be usable and beneficial to society, there are numerous reasons why these technologies aren't on the market. Primarily, economic factors make these devices implausible. Building them might be relatively cheap nowadays, but the affordances these new products bring are not all positive. Electronics need to be charged, can sometimes break more easily [a laptop versus a notepad], and are generally more expensive [like that freakishly expensive LCD table from Microsoft]. They also require more skill to maintain [imagine repairing a sensor behind drywall].

Randy Pang - Apr 22, 2008 01:50:38 pm

The Weiser article gave a fairly good historical context for the foundations of ubiquitous computing, and the Bell and Dourish paper gives a fairly well-supported analysis of the current state of ubicomp. I personally agree with the conclusions the Bell and Dourish paper came to: specifically, that instead of focusing on ubiquitous computing as a phenomenon which allows you to do a bunch of "neat" things (like the use case given by Weiser), we should instead focus on how to embrace ubicomp to add real value within our already connected society. I follow a lot of "web 2.0" (for lack of a reasonable word) news, and a lot of people seem to create a lot of hype about mobile being the next big thing, or specific types of social networks being the next big thing, or widgets, or x, or y, etc. But I think that oftentimes so many people get caught up in these pipe dreams of "next wave" technologies that few people try to add real value to the world in its current state (which is pretty cool in its own right). The use cases given by Weiser may seem neat, but they just don't seem that useful to me. But having access to the sum of collective human knowledge in the form of something like Wikipedia - that's pretty useful (and it doesn't hurt that it's pretty neat too).

P.S. I don't know if anyone else had this problem, but the link for the Bell and Dourish paper didn't work for me; you can quickly Google for it and find a superior HTML version in Google's cache.

Joe Cancilla - Apr 22, 2008 01:37:18 pm

I agree that ubiquitous computing is an interesting idea, but I don't quite get how our data is going to be coordinated amongst various devices. I agree that computers will be so cheap that there will be "scrap computers that can be grabbed and used anywhere; they have no individualized identity or importance." How will my data be accessible on these pads though? Will all of my data be stored on Google? Will I have my own private server somewhere?

It just seems to me that there are always going to be so many competing ways of doing things with electronics that it will always be difficult to have all of these ubiquitous computers synced up. Take, for example, the person who owns all Apple products. They have an iTV, an Apple desktop, an iPod, an iPhone, a .Mac account, a PowerMac, a Nike pedometer, etc. Syncing all of these devices might be easy, but what about when they go to work and everything is geared towards Windows? Take cell phones as another example. None of them really work together. They are capable of doing a lot more than they do, but they are locked down by service providers.

Hopefully, open source solutions like Android will overcome some of these problems. Check out this augmented reality application that someone is developing for Android. Apparently Nokia has been working on this for some time, but failed to commercialize it.

Reid Hironaga - Apr 22, 2008 02:07:05 pm

These papers on ubiquitous computing gave excellent examples of practical implementations that most people would be drawn to. While the predictions made in many portions of the readings are not fully in effect now, it is important to note that many of them have become relevant. This is definitely one of the less relevant papers as far as designing at the level of our Android projects goes. All around us we see ubiquitous computing in the form of smaller and smaller devices, as well as microchips found in every sort of appliance and tool we use. This saturation of technology is perhaps the ubiquitous expansion of computing the readings refer to.

Kai Man Jim - Apr 22, 2008 02:15:51 pm

Both papers are really good. Ubiquitous computing devices can really keep computer scientists brainstorming to figure out better ways to push our technology to a new, higher level. The examples of PDAs and cell phones have become part of our lives. Ten years ago, who would have thought a cell phone would not only be able to make calls, but also handle information like GPS maps, calendars, songs, and pictures? But now our cell phone is just a small device that can do a lot of things. Like Megan said, since we have GPS to monitor our current location, I think the idea of having a computer sense where it is located and carry out a task specific to that location could be possible. But like the second paper says, we should ask whether the future is already here. If ubiquitous computing is already here, then we should think of more effective ways to improve it.

Gerard Sunga - Apr 22, 2008 02:12:34 pm

These papers offered interesting perspectives on ubiquitous computing. The first paper offered some great ideas about the possibilities of complete integration of technology with the ordinary person's lifestyle. It was interesting to see the high hopes, not too long ago, for rapid development and advancement in technology, with this complete integration expected to happen within a couple of years from now. I especially liked the in-depth description of the integration, extending to every aspect of one's lifestyle. The second paper brought up some good points about the presence of ubiquitous computing today (especially the growing integration of cell phones into one's lifestyle). Bell's descriptions of nations that display ubiquitous computing were fairly revealing of the current state of technology's integration with our lifestyle.

Bruno Mehech - Apr 22, 2008 02:16:51 pm

I agree with Bell's argument that ubicomp is already here, though it certainly isn't very evenly distributed. As Bell says, ubicomp does not seem to be taking the form that Weiser imagined. I think it would be very cumbersome to have multiple tablet-like PCs spread out over a physical desk; it is much easier to have the digital desktop and windows, which are easier to use and move around than physical "windows". This, of course, has been made possible by a great increase in screen resolution, which was Weiser's main complaint about virtual desktops. I think the same goes for Weiser's other predictions. As for having many computers per room that the user has control over, that just makes things too complicated - not only for the user, who has to keep track of all these systems, but also for the creators, who have to make sure all these systems can interact properly in a way that makes sense. One thing Weiser seems to have gotten right is smart IDs, which right now take the form of RFID, though he might not have predicted the huge problem people have with privacy, which has greatly slowed its adoption.

Benjamin Sussman - Apr 22, 2008 02:16:19 pm

For reasons similar to the existence of the "memory hierarchy" (why not have hundreds of gigabytes of cache?), I think that we, as a technologically advancing society, have already made great leaps towards what Weiser would consider an environment filled with ubiquitous computing. However, for the fantasy world where your phone talks to your fridge, which talks to your central heating, which talks to your electrical outlets, all fed metadata from the internet, the largest hurdle (as mentioned before) is a consistent interface. It is now technologically reasonable to have both the hardware and the software to not only make these communications happen but also have them be meaningful and fast. But in order for my "home system" to talk to my fridge there needs to be a clean and well-defined interface, and this is one of the things that the capitalist system is not good at enforcing. If a Sony fridge has a unique "weigh milk" feature, while a Panasonic fridge has a "soda counter" feature, how are they going to communicate with the numerous other computers in a meaningful way? The other computers would need significant expectations about the signals they receive, and potentially even access to a network where they can all receive updates to remain at consistent versions. While network access is not unreasonable, we would need (government?) organizations to mandate specific communication and interface standards to ensure that new products fit in with the system and are backward compatible with old products.

In the inevitable event that these standards change, we could potentially make thousands of buildings, homes, and institutions irrelevant and outdated because their computers are not compatible with the new standards. To prevent this, we may have to make new standards completely backwards compatible, and this would force legacy problems to propagate through the years, making ubiquitous computers everywhere that much less effective and useful. There are big problems to be solved, and we are not there yet.
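
A sketch of the kind of shared interface this argument calls for, assuming a hypothetical standard: every appliance implements a small common core, and vendor extras like "weigh milk" are advertised capabilities rather than silent assumptions, so mixed-vendor homes keep working. All class and method names here are invented for illustration.

    from abc import ABC, abstractmethod

    class Appliance(ABC):
        """Hypothetical mandated core that every vendor must implement."""

        @abstractmethod
        def status(self) -> dict:
            ...

        def capabilities(self) -> set:
            """Vendor extensions are advertised, never assumed."""
            return set()

    class SonyFridge(Appliance):
        def status(self):
            return {"power": "on", "temp_c": 4}

        def capabilities(self):
            return {"weigh_milk"}

        def weigh_milk(self):            # Sony-only extra from the example above
            return 0.7                   # kilograms of milk left

    def home_controller(devices):
        for device in devices:
            report = device.status()     # safe: part of the common core
            if "weigh_milk" in device.capabilities():
                report["milk_kg"] = device.weigh_milk()
            print(report)

    home_controller([SonyFridge()])

The design choice is that the controller only ever relies on the mandated core and probes for extras, which is exactly the backward-compatibility property the paragraph argues a standards body would have to enforce.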

Harendra Guturu - Apr 22, 2008 02:19:28 pm

I think the design of ubiquitous computing is something that will be very important, since I agree that the computing world will evolve to the point where it is an integral part of humanity, and this evolution will be very painful or stalled if good designs don't provide the bridge. I listened to a talk recently about sensors that can be embedded in a person's house and will learn the person's preferences, such as temperature and lighting, eventually setting the appropriate climate when the person enters. I think ubiquitous computing such as this still needs good design so that configuration can be done.

Zhihui Zhang - Apr 22, 2008 02:18:51 pm

The idea of ubiquitous computing is a curious one. Along the lines of Mark Weiser's example of a day in the life of Sal, I once tried a proof-of-concept app for PDAs that let you expand your computer desktop onto the PDA over a Wi-Fi network. The idea was that you could drag a document or an image over to your mobile device and take the device with you (perhaps to a different office for discussion, as in the example in the reading). Although the idea was a good one, I found that too much time was spent connecting the various devices together and getting them to operate properly. Technologies like Bluetooth that allow devices to work together are a good idea, but currently the setup time is too great. Having a Bluetooth headphone that works with all your devices is handy, but the benefit is lost if you have to wait two minutes before you can use it.

Michael So - Apr 22, 2008 02:29:33 pm

"The Computer for the 21st Century" article was kind of scary, because the world it hypothetically presented was like some sort of sci-fi movie where people can be aware of where everyone is, can virtually communicate with people as if they were there in person but are not, and are able to find people wherever. The article's analogy of a burglar breaking into someone's home illustrates how bad people could use such technology to invade other people's privacy and mess with things that are not nice to mess with. There will be fingerprints, the article says. But even if they are evidence of a break-in, it does not mean that the person who was invaded would be able to regain what was lost or undo what was done.

As for ubiquitous computing, I think that computing devices have penetrated daily life. People take for granted their laptops and desktop computers and palm pilots and cell phones. Those devices and other computing devices have become a ubiquitous part of a myriad of people's lives. But I guess computing devices aren't as ubiquitous as paper, because there are people who do find computers intimidating and not as automatic as knowing how to use paper. Having those tabs that the first paper was talking about sounds sort of interesting. I would like to see pictures accompanying the text about these tabs. And a video of the imagined world described in the first paper would be nice too.

Jiahan Jiang - Apr 22, 2008 02:38:32 pm

I really enjoyed The Computer for the 21st Century, though I think the idea of ubiquitous computing comes with a lot of strings attached. When people interact with ubiquitous technology, they aren't as aware of their actions (which is what we try to achieve); however, that makes it more mistake-prone. When we write, we make a lot of mistakes because we don't pay as much attention as we did when we were learning prose or poetry, but normally we have time to look it over, edit, etc. With computing, however, mistakes can often be very serious, and most of the time our actions have immediate impact (so we don't have as much time to think and edit), so security, allowed user behavior, data integrity, etc. become a lot more difficult. So ubiquitous computing is not just user interactions, but also a lot of safety-control technology that must come with it.

Adam Singer - Apr 22, 2008 02:45:34 pm

I really enjoy forward-looking readings such as Weiser's paper on ubicomp. It's visionaries such as these who inspire people to innovate and make these predictions come true. One thing in particular that I noticed was his statement about batteries. Weiser claimed that small batteries in the future would be able to provide these "pads" or "tablets" with days of power on a single charge.

While most of his predictions about performance were correct (CPU speed, network speeds, etc.), the prediction about batteries was way off. In fact, it seems as though battery technology hasn't progressed much in the last 10 years at all.

Though, admittedly, I don't know much about upcoming battery technologies, I'm excited to see the advances in battery technology in the near future. Interestingly enough, with claims of upcoming "wireless power", the notion of a battery may soon become obsolete. In my opinion, getting power to these "ubiquitous" computers is the final obstacle in the way of truly integrating computers into our lives.

Jeremy Syn - Apr 22, 2008 02:40:55 pm

I thought that the Weiser article was interesting in its vision of creating a ubiquitous computing environment. His idea of tabs, pads, and boards was very intriguing but very plausible as well. However, I don't see something like that actually happening until some time from now. What he is describing practically makes the use of paper obsolete, making everything we do electronic, from writing little notes to presenting to a group on a board. I also noted the date of the paper when he mentioned Windows 3.0 as being among "today's windowing systems". He envisioned this sort of environment almost two decades ago, but we haven't fully gotten there even now.

I noticed how much closer countries like Korea are to ubiquitous computing than we are in the United States. The use of computers is apparent everywhere in Korea, but this can be mainly attributed to the fact that it is a small country with highly dense cities. Other small countries with high-density populations, such as Japan, are similarly capable of pulling off a strongly connected networking system. Still, I can see that America will soon be able to provide a closely connected network, and that the use of computers will then become completely ubiquitous.

Daniel Markovich - Apr 22, 2008 02:11:25 pm

After reading Weiser's article The Computer for the 21st Century, I was quite surprised by the lack of progress we have made in ubicomp in the last 15-20 years. Even very useful devices that PARC implemented in the early 1990s have yet to become widely used. The "boards," or interactive whiteboards, are a good example: these would be great teaching aids, making lecture content accessible to students afterwards with minimal work by instructors. The only interaction I have had with such a device was in community college, and its design and quality were horrible, probably not much better than PARC's 15 years earlier. Although ubicomp is much more complicated and requires a much higher level of technology than currently exists, I feel that this is definitely a field we need to pay more attention to. The quality and ease it could add to everyday life are, I believe, unprecedented compared to any other technology we have seen.

Andrew Wan - Apr 22, 2008 02:37:00 pm

Both articles provided interesting takes on ubiquitous computing. I thought Weiser was correct in predicting better (and more widespread) computer use in offices and homes, especially with "panels" and web access. It seems fairly established that people in general are already very close to internalizing computer usage (at least in developed nations). I question whether the increase in computer use will actually address the issue of information overload, or whether widespread adoption will cause the decline of the "computer addict." Information organization is getting better, but people are slow to adopt and standardize tools, and more content is always being created (there are many more ways to waste time now than ever before, no thanks to the web). As for the decline of "computer addicts," I tend to think that increased computer usage will actually increase their numbers. It also seems unfair, at least to me, to compare computers now to quartz radios. The fact that computers can be programmed makes a tremendous difference: software provides easy access to a tremendous amount of potential functionality, whereas radio "hacking" was constrained both by the number of broadcasts and by the availability of parts.

Jason Wu - Apr 22, 2008 02:54:37 pm

Ubiquitous computing as a whole is a concept that I believe will inevitably be fulfilled; however, some of Weiser's predictions show their dated nature. One example is the concept of the battery, whose limitations we are now trying to circumvent in various ways. He has other insights, though, such as smart IDs, which also carry the scary idea of Big Brother along with them.

In the end, just as sci-fi novels have long predicted, I believe computers will become part of everyday life, embedded in items we take for granted (cars, toasters, etc.) and acting intelligently. The research and the possibilities are there.

Andry Jong - Apr 22, 2008 02:57:48 pm

Although I personally think the article about ubiquitous computing was pretty interesting and intriguing, I cannot help thinking that I can't relate this topic to our project of designing an Android application. Many of the predictions in the reading have become real by now, and some have not; I suspect those predictions simply happen to match reality well enough that we find them relevant. Beyond that, I think of this reading as little different from science fiction; and although in most science fiction the world ends in domination by computers, I would personally rather live in a world where we can use technology to get anything we want. I would love to live in a world where technology is integrated into everything. It just seems cool to me. In general, I believe a user interface designer, or the designer of any computer or technology, wants to help users get information and do their activities more easily. It is not impossible that the idea of ubiquitous computing will become reality someday. Just like other science fiction, it might well come true if we keep improving our technologies.

Max Preston - Apr 22, 2008 12:56:20 pm

Although it may seem like widespread implementation of ubiquitous computing would make daily activities far easier, there are a number of hurdles and potential problems. For instance, regular appliances that you expect to work with 100% reliability for years could start sharing problems common to computers. What do you do when your toaster crashes and you don't know how to reset it? Depending on the number of auxiliary features, appliances and other household devices could also be hacked, with consequences ranging from irritating to dangerous. For example, say manufacturers quietly start putting wireless adapters into household devices so they automatically get updates from the manufacturer. If hackers reverse-engineer how the updates work, they could inject malicious code that makes your toast always come out burned or your shower's water always scalding; if they gain control of something like your house's heating system, they could potentially start a fire. Beyond the possible dangers, ubiquitous computing can hurt the consumer as well: in addition to losing reliability, appliances could increase in price and might have shorter life spans. Ubiquitous computing is important, but we still need to be careful about how far we take it.
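
The injection scenario above hinges on an appliance accepting any update it receives. As a minimal sketch (hypothetical, not something from the readings), here is how an appliance could refuse firmware that doesn't carry a valid vendor tag; real products would use public-key signatures so no secret has to live in the verifier, but the idea is the same.

    import hashlib
    import hmac

    # Hypothetical secret provisioned into the appliance at the factory.
    VENDOR_KEY = b"factory-provisioned-secret"

    def sign_firmware(image: bytes) -> str:
        """Vendor side: tag a firmware image so appliances can verify it."""
        return hmac.new(VENDOR_KEY, image, hashlib.sha256).hexdigest()

    def install(image: bytes, tag: str) -> str:
        """Appliance side: refuse any image whose tag does not verify."""
        expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, tag):
            return "rejected: unauthenticated update"
        return "installed"

    official = b"toaster-firmware-v2"
    print(install(official, sign_firmware(official)))               # installed
    print(install(b"always-burn-toast", sign_firmware(official)))  # rejected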

Mike Ross - Apr 22, 2008 02:34:51 pm

The first article sounded revolutionary until I realized it was from 1991. These are great ideas, even for now, but actually implementing them in an affordable and sustainable way sounds impossible to me. With the wealth of mobile devices we have now, I can't imagine spreading multiple minicomputers throughout every room in an office complex. I wonder what Weiser's thoughts on the current state of technology are. We have wireless networks everywhere from office buildings to coffee shops; laptops are in the hands of people who definitely don't fit Weiser's category of a "technological scribe"; PDAs and cell phones grow more powerful every generation (notwithstanding that American cell phones are worlds behind the rest of the industrialized world's); and company intranets and nearly ubiquitous high-speed internet access make sharing data just as easy as carrying a pad between rooms. I really wonder how far we've come in terms of achieving his vision. Anyone can pick up a cell phone, and computers have made huge leaps in appealing to non-tech-savvy crowds, so they have become a part of everyday life for many people. Anything more almost seems gratuitous. I do wonder how we would interact with computers differently if we worked under Weiser's paradigm of tabs and pads.

At the end of the article, though, it seems like Weiser glosses over a lot of things for the sake of describing a technological utopia. Take the pen sending the quote from Sal's newspaper to the office. If such an interface were designed, it would almost certainly come with options. It would have to be flexible; otherwise you'd have wasted a ton of money on an expensive pen/stylus, or you'd be carrying a dozen pens with you at all times for different functions. I feel like cost is overlooked in all these examples, and that we can already do most of these things with current computers; it's just not in some fantasy setting.

EricChung - Apr 22, 2008 02:44:04 pm

As for the first reading, I found the information interesting. It depicts a kind of "practical sci-fi" world, although while reading it I had a lot of misgivings, since I'm not one of those people who believe technological progress is always good in the long run. It is good to see issues of privacy being seriously considered and misuse of the technology getting attention, although just because these computers are ubiquitous doesn't mean malicious use of them goes down. "Hacking" the old radio stations was not very practical (and you could get caught somewhat easily), while computer hacking can get much more personal with any one individual, without leaving "fingerprints" behind like in the real world. I'm also not going to buy the whole "cause fingerprints to be left behind" idea until I see more solid evidence of something like it. The last sentence, though, seems especially relevant to this course: systems that integrate into humans' lives are much better than systems that force humans to integrate into them. I also think it's interesting that this was written almost two decades ago.

The idea that ubiquitous computing is already here is a good one, I think; the more communal focus of the original paper is probably not going to come to pass in our kind of society (which is more individualistic and capitalist). The version where we carry around cell phones, PDAs, and digital cameras, and where kids grow up with computers so their use becomes ubiquitous (along with a little of the embedded computing originally envisioned), is more in line with how ubiquitous computing would pan out in real life instead of in science fiction books. And when one thinks about it, reality shouldn't necessarily be as clean-cut as fiction. Eventually, though, the study of ubiquitous computing seems to become more of an anthropological and sociological issue, especially if one starts thinking of it as a "present" rather than a "future" thing. In that case, it seems to be more an area of public policy than of technological innovation (although it doesn't strictly have to be).

Pavel Borokhov - Apr 22, 2008 03:08:29 pm

I think that Weiser's paper was quite accurate, and in many ways could be implemented fairly easily today (with infrared replaced by Bluetooth). In fact, some things, such as cardkey access (was this the first time it was introduced?), are in daily use on our very own campus and in the vast majority of access-controlled buildings. However, I seriously wonder when a world like the one he describes toward the end of the reading, where computers know all our actions and locations and react accordingly, would actually be possible. More specifically, I feel like in today's environment a lot of people would consider such a computerized environment a serious invasion of privacy; furthermore, even if proper privacy controls were in place, the sheer number of privacy rules one would have to devise to replicate the real-world privacy barriers we have set up would be absolutely ridiculous. Consider something as simple as the website Facebook: even there, the vast majority of people choose very broad privacy settings with minimal granularity, despite having full power to create very specific privacy rules. And that's just a single website. Consider needing to set up similar privacy rules for all aspects of your life; the amount of time you would spend doing this would probably outweigh the amount of time you would save by having all these devices around you. The alternative could be that the line between private and public becomes more blurred, something that is already happening, but my guess is that this would take a few generations to happen completely. In due time, I feel ubiquitous computing will be inevitable, but the main question is: how long will that take?
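
To make the scale of that rule-writing burden concrete, here is a minimal sketch (all names hypothetical, not drawn from either reading) of per-observer, per-context, per-data-type privacy rules; even this toy model of one person's life needs a hundred explicit decisions.

    from itertools import product

    # Hypothetical dimensions a ubicomp privacy policy would have to cover.
    OBSERVERS = ["spouse", "coworker", "employer", "neighbor", "stranger"]
    CONTEXTS = ["home", "office", "commuting", "vacation"]
    DATA_TYPES = ["location", "calendar", "biometrics", "browsing", "purchases"]

    def default_policy():
        """Start closed: every (observer, context, data type) combination
        is denied until the user makes a deliberate exception."""
        return {rule: "deny" for rule in product(OBSERVERS, CONTEXTS, DATA_TYPES)}

    policy = default_policy()
    policy[("spouse", "home", "location")] = "allow"  # one deliberate exception

    # 5 observers x 4 contexts x 5 data types = 100 rules, and this is a toy.
    print(f"{len(policy)} rules to review for one person")

Real life has far more observers and contexts than this, which is part of why people fall back on coarse settings in practice.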

David Jacobs - Apr 22, 2008 11:53:10 am

As cool as the world Weiser envisions sounds, I can't imagine it catching on in the form he describes. The fundamental problem is that computers as they exist today are a part of our lives, specifically a part that we don't want to go away. Ubiquitous computing as I understand it refers to using computers to make our otherwise analog tasks easier (pushing around pieces of paper, working on whiteboards, etc.). My qualms lie with the fate of activities that have grown out of modern computing, specifically applications like video games. How does fighting a horde of zombies map onto pushing around pads of paper? I suppose the Wii is a first step toward that kind of interaction, but all the same, I can't imagine jumping around a room for days on end (which might be a good thing).

Timothy Edgar - Apr 22, 2008 03:18:13 pm

I really liked the second article as a commentary on the first. The sci-fi-ish concept of ubiquitous computing doesn't seem too far off, but at the same time it is a grand vision that is perhaps a bit dreamlike. Sal's world does seem to raise a lot of privacy and regulation issues. At the same time, comparing 1991 with now, cars have so many microprocessors in them that driving can nearly be called "fly by wire." Between digital readouts, control systems, and even things such as GPS, I imagine you could argue that it is a form of autopiloting. Computers have been embedded in many aspects of the information processing of our lives, and as the second article suggests, it would be interesting to take a step back and understand that, rather than saying ubiquitous computing isn't here yet. Overall, it was quite an interesting read.

Zhou Li - Apr 22, 2008 03:00:46 pm

The research field of ubiquitous computing concerns the possibilities and potential of integrating computing devices into people's everyday lives. Weiser's vision of Sal's world read like a scene from a science fiction movie taking place in the future; that's because the central time frame of ubiquitous computing is the "proximate future." Weiser's vision was to let the computers themselves vanish into the background, so that people would use them without thinking about, or even being aware of, them. I think his prediction of the future, which we are already in, makes sense, because creating systems that extend, or take the appearance of, common things people are already familiar with can reduce or even eliminate the learning curve while increasing the functionality and productivity of existing objects. Most of the high-tech devices he mentions in the fictional story are also possible to produce with available technology today. However, economic efficiency may be one of the reasons Weiser's vision has not come true: the cost of making those devices still doesn't justify the benefits they generate. Also, while one can just buy a laptop PC and use it anywhere independently, it would require building massive infrastructure and integration for ubiquitous computing to really arrive in the near future.

Siyu Song - Apr 22, 2008 03:05:03 pm

I found "The Computer for the 21st Century" very interesting. One thing I did notice in the beginning is the author compared computers to literacy technology, and how the current incarnations of computers are where writing was hundreds and hundreds of ago. My questions regarding that would be 1) has literacy technology matured to the point where no futher blending into reality is necessary? and 2) how long will it take the tab, the pad, and the board to fully mature? (Mostly because I would really like to be alive when that happens)

One of the challenges I see for ubiquitous computing is the limitation of displays versus paper. I don't know whether one exists yet or is in development, but a display that does not need to be backlit and that can be seen more easily as ambient light increases is necessary for computers to truly blend into the background. Many displays are optimal only when there is little ambient light, and I think this restriction will be a big hurdle to making computers and displays truly unnoticeable.

Ravi Dharawat - Apr 22, 2008 03:23:19 pm

The Weiser article illustrates an extreme of ubiquitous computing that will most likely not come to pass, for a few reasons. One reason, I think, is management. For one human to manage several devices is something of a nightmare. Sure, if they are devices that are nigh invisible and require very little interaction (i.e., they are simple), that is fine; it is actually not very different from what we do today (we keep different tools in our toolbox, each accomplishing a few specific tasks). But what about tools for tasks that are not simple, or that need highly configurable solutions? Having these on one machine ensures a modicum of similarity, but having several from different providers introduces not only extra costs but different interfaces. This could be solved through strong standards regarding interfaces, but commercial interests make that unlikely; what is more likely is that different companies will sell different packages of ubiquitous devices. Then a second problem arises: communication. Ubiquitous devices should be able to communicate, at least in some scenarios, for optimal usage. While that may be solved by a package within the home, what about outside? What about at the mall? This demands another set of standards, which again are difficult to agree upon (see the sketch below). My final point regards security. One simple device can prove difficult to secure; what about several different devices, produced by different manufacturers, acting in concert? It seems a logistics nightmare.
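
As a minimal sketch of the kind of shared interface standard described above (all names hypothetical): if every vendor's device exposed the same small contract, one controller could drive them all interchangeably; without such agreement, each vendor's package needs its own integration code.

    from abc import ABC, abstractmethod

    class UbicompDevice(ABC):
        """Hypothetical minimal contract that vendors would have to agree on."""

        @abstractmethod
        def describe(self) -> dict:
            """Report capabilities so other devices can discover this one."""

        @abstractmethod
        def send(self, command: str) -> str:
            """Accept a command from the standard vocabulary, report status."""

    class AcmeThermostat(UbicompDevice):
        def describe(self) -> dict:
            return {"vendor": "Acme", "kind": "thermostat",
                    "commands": ["set:<temperature>"]}

        def send(self, command: str) -> str:
            if command.startswith("set:"):
                return "temperature set to " + command.split(":", 1)[1]
            return "error: unknown command"

    # The controller depends only on the shared contract, not on any vendor.
    devices: list[UbicompDevice] = [AcmeThermostat()]
    for device in devices:
        print(device.describe())
        print(device.send("set:21C"))

The point is not the thermostat; it is that the rest of the system programs against the contract rather than against any one manufacturer.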

Paul Mans - Apr 22, 2008 05:26:46 pm

In some ways Weiser comes off as a cheerleader: he describes his vision of the future with high confidence and in words encouraging to other researchers, but he often doesn't describe the specifics of how his vision will be realized. An example is his grand idea of computers that fade into the background because users don't have to think to use them. This, of course, is the goal of any interface, and Norman would argue it is only achievable by following principles of good design: providing obvious mappings and reducing the semantic and articulatory distances. Weiser, though, is basically proposing a future where computers are devoid of gulfs of evaluation and execution (something Norman et al. claimed was impossible), yet he makes no mention of the problem of generalizability that arises from expression in very high-level languages, or of how evaluating an action's results will be any easier with the interfaces inside his tabs, pads, and boards.

I do not want to say, though, that Weiser's writing has no value. Utopian visions have a long literary history (Thomas More, 1516) and have shown their use as a framework for discussing governmental structures. In other arenas, like STS (science, technology & society), this style of writing, in which societal dilemmas are described in a climate of technological advancement, has traditionally been accomplished in the genre of science fiction. In fact, Weiser himself departs from the nonfiction narrative perspective to describe his utopia from Sal's point of view in a short episode of science fiction. By framing the goals and problems of a future with ubiquitous computers, Weiser is continuing in the vein of science fiction, giving amateurs and researchers alike both fodder for imagination and innovation and a checklist of problems to tackle.


