Historical Perspectives

From CS 160 User Interfaces Sp10


Lecture on Apr 12, 2010


Readings

Mirror for people from off-campus networks or who don't have ACM accounts

Mattkc7 - Apr 15, 2010 10:35:13 pm

Daniel Ritchie - 4/9/2010 0:16:00

It's my second time reading this article, and I still can't help but marvel at how accurate Bush is in many of his predictions. Of course, the exact implementations of his hypothetical technologies never came to pass, but the ideas absolutely took form. A tiny, portable camera that we can use to record our every observation? Every cell phone now comes equipped with such a device. "Wholly new forms of encyclopedias...ready-made with a mesh of associative trails running through them," and constructed by everyday users--sounds a bit like Wikipedia, doesn't it? And of course, I'd be remiss if I neglected the memex. While computers certainly existed pre-WWII, it seems to me that Vannevar Bush proposed the essence of the *personal* computer. Sure, the memex doesn't really compute anything per se, but the way in which Bush describes the device--focusing on the user experience over the technological substrates--evokes the desktop PC experience. In fact, I wouldn't be surprised if the "desktop metaphor" for personal computing had its roots in Bush's memex.

Of course, not all of Bush's predictions proved accurate. Speech-to-text devices, such as the Vocoder he describes, have yet to gain widespread acceptance and may never do so. Nevertheless, Bush keenly understood that in the future, information would be king, and he clearly grasped the importance of external cognition in taming that information (for instance, he calls the memex "an enlarged, intimate supplement to [someone's] memory"). He's also something of an early cognitive scientist, citing the way in which humans process information as an argument in favor of an associative information store. All of this is clearly a precursor to human-computer interaction (or human-machine interaction, as one might have called it in Bush's time) as we know it today.


Alexander Sydell - 4/10/2010 14:16:51

I found the parallels between what Bush describes as the future and the things people are trying to accomplish today very interesting. He talks about having smaller cameras, faster and easier storage of photos, compression of data, and machines that record or project sounds. Today we have most of those things, but we are working on making them all better - smaller cameras, better compression, better speech recognition, etc. It is interesting that although we have most of the technology that Bush talks about, we are still trying to make it all better in the ways he describes. The Weiser article also offers an interesting perspective on how far we have come today - although we have the faster processors and thinner, higher-resolution screens that he discusses, we are nowhere near his concept of ubiquitous computing and have nothing like the tabs he describes, though we are getting closer to the pads and boards.


Annette Trujillo - 4/11/2010 0:13:34

I think the article "The Computer for the 21st Century" does a great job of making predictions, for its time, about the future of technology. It predicted that computers would grow in popularity and that display prices would fall while their resolution rises, and this is exactly the case. Now those resolution gains have extended to TVs. Technology has definitely become more advanced and affordable, just as the price of memory is rapidly going down. Processors have also become much faster and more affordable as well. I think the idea of pushing computers into the background is very realistic and will probably come true within 10-20 years. Technology has already become so common in everyday life (like electricity, phones, TVs, computers) that it just comes naturally to use it every day.


Kyle Conroy - 4/11/2010 0:48:06

The "ubiquitous computer" envisioned in 1991, full of millions of computers in our daily lives, is steadily becoming a reality. Today, we carry around computers which know our current and past location, communicate with a seamless high speed wireless network, and have more processing power than laptops from just a few years ago. With the popularity of smart phones on the rise, the five years will be very interesting to watch. Soon, everyone in the United States can be expected to own one of these devices, creating a permanent connection to the internet. This wireless link, connecting all people, will soon disappear from thought and instead be the norm. The paper is correct to bring up privacy concerns, as we see today how social networks have shattered previous privacy expectations. It turns out that people are quite comfortable sharing their personal information with friends. However, there will always be a small group of people who will refuse to carry a mobile device, just as there are people who refuse to own a car or buy cable. Connection scares some people. The biggest worry I have about the coming future is the ability for computer to break into our worlds, and not the other way around. If instead we become a society in which our faces are always glued to screens, I fear that we become more interested in hanging out with people online that in real life.


Bryan Trinh - 4/11/2010 16:48:23

My first thought after reading these two papers was: wow, their predictions about the computing world were remarkably accurate. The actual embodiments of their ideas did not pan out the way they had anticipated, but their underlying themes all exist. The idea of a network connecting microfilms together, the memex, has obvious connections with the internet that we use today to search through and gather information. The ubiquitous computing alluded to by Weiser is realized by the hundreds of mobile devices and computing tools that everyone carries around in their pockets today. What's even more exciting is that we live in the epoch where Weiser's and Bush's ideas will flourish into vivid reality. We live in an exciting time indeed. Good read.


Jessica Cen - 4/11/2010 22:52:50

I agree with Weiser in his article “The Computer for the 21st Century” when he says that “hundreds of computers in a room could seem intimidating at first” and that later these hundreds of computers will come to be invisible to common awareness. That is what is happening right now with laptops, smartphones, and every other technology being introduced each generation. At first I thought it was intimidating to have internet access through your cell phone, since cell phones were only invented to communicate with others by voice. But now it is assumed people check their email at least once a day, and people make sure that assumption holds by carrying a big communication device such as a smartphone in their pockets at all times. Now I am used to seeing people with their smartphones everywhere, and I don’t feel intimidated by the invasion of these devices in our lives anymore.


Matt Vaznaian - 4/11/2010 23:31:02

Well, it looks like we're halfway there. I really like these guys' analysis of where we will be in terms of ubiquitous computing in the next twenty years. Some of the things they talk about are already taking shape in forms like the iPhone, the iPad, and computers which can keep preferences for multiple users. I don't really know how I feel about having multiple "tabs" lying around my desk, though. I think being able to use one device for multiple conversations and tasks is a better direction, and one I think we're headed in already.


Eric Fung - 4/11/2010 23:47:45

These two articles argue that the most lasting developments in technology are the ones that make the technology invisible and seamless in daily life. Furthermore, the machinery should make repetitive or logical tasks easier, rather than replace the user's need to do abstract or creative thinking. From the sample scenarios describing possible computer usage in the future, it is clear that technology's purpose is merely to seamlessly assist the user in performing his work. It should serve as an ever-present extension of memory, recording photos or notes or trails of thought as needed.

The article mentions that the human mind operates by association, so it's interesting to note that the tagging system popularized by the Web 2.0 movement took so long to take hold. This tagging system is a step in that direction of recall by association. The current tagging system could be improved by also allowing for display of similar tags (like grouping 'photos', 'pictures', 'pics', 'photography' into one set) in case you forget exactly which word you used as your tag. This would heighten the use of recognition over recall.
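
A minimal sketch of that kind of synonym grouping, assuming a small hand-built table of equivalent tags (the tag groups and item names below are invented for illustration):

    # Hypothetical synonym sets; a real system might build these from user data.
    TAG_GROUPS = [
        {"photos", "pictures", "pics", "photography"},
        {"recipes", "cooking", "food"},
    ]

    def synonyms_of(tag):
        # Return the full synonym set containing `tag`, or just {tag} if unknown.
        for group in TAG_GROUPS:
            if tag in group:
                return group
        return {tag}

    def search(items, query_tag):
        # Return every item whose tags overlap the query tag or its synonyms.
        syns = synonyms_of(query_tag)
        return [name for name, tags in items.items() if tags & syns]

    items = {
        "beach.jpg": {"pics", "vacation"},
        "pasta.txt": {"recipes"},
    }
    # Searching "photography" still finds beach.jpg, which was tagged "pics".
    print(search(items, "photography"))  # ['beach.jpg']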


Daniel Nguyen - 4/12/2010 0:43:56

Weiser's article was written in 1991, and now, almost 20 years later, I believe that his idea of "ubiquitous computing" has yet to reach the point that he predicted. While some technologies such as cell phones and personal computers have become necessities in daily life, the learning curve of utilizing technology to its full potential has prevented computing from becoming second nature to the population. It could be that silicon technology is still in a middle ground where those who have grown up knowing computing naturally are not yet old enough to comment on it and those who are old enough to comment on it are having trouble learning to use technology. For computing to become as natural to us as reading and writing are, it needs to be more thoroughly addressed and utilized in schools from a very young age. Even in the current day, where many teenagers and adolescents grow up using computers, cell phones, iPods, and other technologies, the level of knowledge is far from uniform across individuals. For ubiquitous computing to be achieved, the standard level of technological knowledge should be higher and more uniform across similar age groups. Either this, or the level of technology used on a daily basis should start low and be slowly moved upward. Writing is no doubt ubiquitous, but only because the amount of knowledge required for the average person to read everyday writing is very limited. Stop signs and billboards do not require knowledge of complex novels, so technology used in everyday life should not expect extreme fluency in computing, as many things currently do.


Jason Wu - 4/12/2010 0:54:08

In "As We May Think," Bush claims that systems of indexing are poorly-suited for humans because we think associatively whereas files are usually sorted using artificial criteria such as alphabetical or numerical order. This makes me wonder why the file system metaphor of directories/folders/files is still used to this day. There must be some computer filing system that more closely follows peoples' association of thoughts; perhaps with more powerful hardware and advanced computing techniques, computer scientists can come up with a better system that can anticipate and display items that the user associates with the item they are currently manipulating. Also, I can't help but think about Wikipedia when Bush says that encyclopedias of the future will have associative trails running through them. It is so easy to access information related to an entry that I become engrossed in a topic and begin to perusing through layers after layers of links for hours.

In my opinion, "The Computer for the 21st Century" is particularly relevant right now. Ubiquitous computing is fast becoming a reality: most consumers own mobile phones, GPS devices, MP3 players, etc., all of which can be carried around and used at any given moment. Weiser's idea of pads sounds almost exactly like the recently released iPad. Perhaps in the future, when the prices of such devices fall greatly and people can afford to own multiple pads, Weiser's vision will actually come true.


Aneesh Goel - 4/12/2010 0:54:46

The article from 1945 raises a fascinating distinction that is easy to overlook, even for someone directly involved in 'information technology': the idea that IT provides a subset of technology for enhancing not the physical abilities of a person but the mental. It is amazing to see a prediction of modern technology, but the article also provides insight into the nature of its development. These predictions are possible because, despite all of the breakthroughs that have made Bush's visions a reality, these advancements are fundamentally continuations of trends already present in the 40s - increased miniaturization, for example, and increased complexity in computational machines (driven largely by the war effort in general and cryptography in particular). Despite how revolutionary the personal computer has been, it is still just an extension of technological trends far predating Moore's Law.

This makes the more recent article, from 1991, more interesting; it too predicts based on trends, but in far more detail, as it looks at a much closer future. In fact, in the less than twenty years since its publication, several of the technologies the article discusses have become commonplace and some of its predictions have come true: embedded systems exist invisibly in just about every facet of our lives, unnoticed because the shift in interfaces was never noticeable, or because their interactions have just become part of the typical task (the thermostat example from the article, certainly, but also washing machines and dryers, or cars, or even the lamp on my desk, which can connect to various iPods and iPhones).


Wilson Chau - 4/12/2010 1:04:06

These readings were different from the other readings we have done in that they don't really focus on specific user interface topics, but more on the future of computing and also the future of how people think. These readings took a step back and started to talk about the information itself and how we process and look at it. This is important to know in interface design because the interface is how we get the information.


Boaz Avital - 4/12/2010 2:04:05

The readings, especially the Bush reading from the 40s, make predictions about what the future of technology will be in a few decades with astonishing accuracy. People are amazed by this because they are not used to it in any other field, since new technology is often seen as a combination of unforeseeable advancements and transient needs. However, computer science and computer technology fall very far from that characterization because, unlike, say, the cotton gin, high technology has almost no basis in the physical world. It is limited almost entirely by human imagination. So in the 40s, when someone envisions a personal computer, specific internal technologies aside, it is less of a prediction and more of a goal. All that is needed to fulfill it is time.

The Weiser reading also said something interesting right away about technologies that I agree with: you know a technology (or in the case of this class, a user interface) is truly good and useful when people forget that it's there. Like pen and paper for the goal of writing, if the user can forget that she's using an interface and be completely goal oriented, then the interface is a success.


Sally Ahn - 4/12/2010 5:00:09

Mark Weiser's article paints a glorious scenario of ubiquitous computing. His observation that "whenever people learn something sufficiently well, they cease to be aware of it" leads him to define "ubiquitous computing" as "transparent" technology that blends into people's everyday lives. This was a new perspective for me, and although I think I agree with Weiser on most aspects, I cannot help feeling a bit wary of its implications. For one thing, I agree that the goal of "invisibly enhancing the world that already exists," which Weiser defines for ubiquitous computing, would be much more beneficial than "simulating the world" (as in the case of virtual reality). Nevertheless, although the "tabs, pads, and boards" in his scenario seem to make Sal's life much easier than ours, they also create social issues. Privacy is one that Weiser points out himself, and he dismisses it with his faith in cryptography. However, I am also concerned by the "invisibility" of the computers--the very quality that makes them "ubiquitous". It seems to me that a fault in a system could have much greater impact, as well as be more difficult to fix, in this world of ubiquitous computers. Of course, bugs have the potential to create enormous harm in current technology as well, but in the case of ubiquitous computing, the potential for damage seems greater because functionality depends on interaction among many small computers, each with its own chance of having an error. Thus, while I would like to envision the world with the convenience that ubiquitous computing could provide, I am also hesitant to trust its reliability.


Tomomasa Terazaki - 4/12/2010 9:05:08

These articles for today were a little different from our usual articles. Usually we read something about interfaces, or about how to make a better interface by learning how the human mind and body work. The first article, “As We May Think,” was about how much time and effort it takes to make new products. So much work is put into them, and the future of the technology will be unbelievable. Since the goal of technology is to make things more convenient, most technological objects will be faster and smaller and have better functionality (for example, the article said that cameras will be able to take pictures from further away and enlarge them without getting blurry). The second article, “The Computer for the 21st Century,” concentrated more on computers, as the title gives away. It talked about many computer-related objects like pads or screens. I really did not understand the importance of the Sal story. Maybe the author thought it would make things easier for the readers to understand, but I was just confused.


Jungmin Yun - 4/12/2010 10:17:51

These articles are pretty interesting, but they are somewhat old. They discussed how we can make computing more helpful and easier to use. The first article introduced the memex, which is basically a technology to store data so that users can access it easily. As of now, we have ways to store tons of information, but we don't have good ways to access it. This article gives some ideas for solutions to this problem, but we are not nearly there. The second article talked about the concept of ubiquitous computing. The idea was innovative at the time, and even today it still proves to be effective. The benefits it gives are obvious. I like the idea that everything is connected as a network, and information is shared across devices. However, there are also many issues with it, such as hackers.


Chris Wood - 4/12/2010 10:35:46

I thoroughly enjoyed both articles in this week's reading. The article about making computers so naturally linked to human cognition that interacting with them would evoke the same feelings of relaxation as a walk in the woods made sense to the point of being disturbing. This, however, cannot be done overnight. Any user interface is going to take some getting used to, but over time it will become second nature (maybe eventually first nature). We are so relaxed by a walk in the woods because humans have been exposed to this interface with the world for thousands of years. Give computers another thousand years and I'm sure that the computer will be ubiquitous and generally seen as part of nature and part of the world, rather than as its own entity with its own universe (the internet). Also, the article on new technological trends and parsing the daunting amount of information presented to us was very smart and well written. I loved the part about Mendel's discovery of genetics and its lag in gaining general acceptance because nobody was able to grasp and expand upon it.


Vidya Ramesh - 4/12/2010 11:19:14

The author discusses the value of being able to manipulate ideas and store them for the record. He remarks that if the scholar must spend all his effort just keeping up with the most recent work, he is left with very little time to compose and create breakthroughs. He then points out that man has built a civilization so complex that he needs to mechanize and record his history so as not to be overtaxed. He concludes the article by articulating a deep desire that man continue to strive to learn how to wield his record for true good and not stop before this is achieved. The article titled "The Computer For the 21st Century" discusses ubiquitous computing. The author defines this term as the ability of the computer to become an everyday, commonplace, abundant resource. By allowing computers to become like this, the author promises that individuals using these computers will become more aware of the people on the other end and forget about the technology that allows for that communication. He concludes the article with the idea that "machines that fit the human environment, instead of forcing humans to enter theirs, will make using a computer as refreshing as taking a walk in the woods".


Geoffrey Wing - 4/12/2010 12:48:28

The reading by Vannevar Bush, "As We May Think," is especially interesting to me because I have learned a little about him and his "memex" system in another class. His "memex" system is an early precursor to the world wide web/hypertext systems. His article is as applicable today as it was back then. He states that the human mind operates by association, which makes hypertext a solid system. The articles are not ordered in any particular way, and you can easily jump to another article by clicking on the corresponding link. Today, as we create more and more information, managing it becomes increasingly difficult. Since a single person may have to access a large amount of information, it is especially important for UI design to allow for quick and flexible retrieval (the objective of the memex system).

I also really enjoyed the second article, "The Computer for the 21st Century." The author brings up an interesting perspective that I had not thought of before. Computers are definitely confining - you are just focused on a small box. Also, the idea of having computers everywhere is not too far off. We carry mobile phones with us almost everywhere we go. Advertising/signage is becoming digital and dynamic. Pretty soon, we'll have to think about UI design for everyday things like clothing.


Wei Wu - 4/12/2010 13:22:53

One large problem in 1945 that the author of "As We May Think" presents is the retrieval of information that already exists -- he says, "...we can enormously extend the record; yet even in its present bulk we can hardly consult it." His solution to this problem on an individual basis is the "memex," which is alarmingly reminiscent of the modern-day computer. Of course, his vision of the memex is relatively narrow compared to the tasks made possible by computers today. The user of the memex can only flip through documents that he has saved on his machine, while the Internet today makes virtually all information in the world readily available to any individual in seconds. The use of multiple levers and buttons in the memex UI also seems clunky compared to our use of the keyboard and mouse for control.

Still, it is remarkable that someone was able to imagine the future so accurately. This shows that for certain usability problems faced by users, there is only one superior solution. In this particular case, the personal computer was the answer to the issue of information retrieval.


Charlie Hsu - 4/12/2010 14:15:58

"As We May Think" foreshadows many of the problems we address and create applications for today. One of Bush's primary concerns is "the record," how humans use science to augment their memories. Not only do we need to write to memory, but we also must be able to read and efficiently sort through memory. Bush talks about photography as writing to "the record," and the goals of the advancement of photography he came up back in 1945 still apply today! (dry photography, compressing photography [digital]). His criticism of data input via figure-writing, and alternative solution of positional configurations, are things addressed even today, as seen by the Palm Pilot corner-tracing input. Database management systems like SQL today mirror his views on the future of selection ("produce a list of all employees who live in Trenton and know Spanish" sounds just like a SQL query!), and Wikipedia's liberal use of links within articles shows the implementation of an associatively based selection process.

Weiser's article on ubiquitous computing seems to reflect certain developments made today as well. The RFID wine rack UI we observed in class is a good example of computers integrating into the human environment. The privacy concerns were prominent in my mind from the very beginning of the reading, and I was glad he addressed them at the end, but I was ultimately unsatisfied; ubiquitous computing may seem useful in a benign environment such as managing a personal wine rack, but what about with more sensitive information such as credit card numbers or identification?


Alexis He - 4/12/2010 14:26:06

In the article "The Computer for the 21st Century", I find it apt that we're reading this just as next year will be the 20th year after the initial prediction that "what we call ubiquitous computing will gradually emerge as the dominant mode of computer access in 20 years." There are so many similarities between the "iPad" and the "pad" in the article that leads me to suspect that this article may have been a large influence on Apple's engineers. For one, the bottom picture in Figure 6 looks eerily familiar to the icon interface for the iPad. Another, is the analogy between an iPad and a paper pad. Although the iPad isn't as dispensable as paper pads, it's the same size and meant to be used in conjunction with larger computers (it's not a substitute for a desktop or laptop). And lastly, iPads require a network to download its content (apps). This highly networked environment is what ubiquitous computer relies on to "communicate with most workstations".

When I first heard Apple's announcement that their next device was named the "iPad", I had many misgivings. But after reading this article, I now understand the analogy of traditional paper-pads and possibly what Apple is trying to create.


Richard Mar - 4/12/2010 14:44:36

In the 'Computer for the 21st Century' reading, the most-mentioned device is the 'tab'. Although a lightweight multipurpose device is a nice thing to have, there is the issue of supplying power to and communicating with the device. Without wireless power, the tab needs to be docked or have its batteries replaced every so often. Wireless communications are fine, until there are fifty to one hundred different devices all trying to communicate via wireless signals. Unless those two primary hurdles are overcome, the tab-based scenario described in the reading will not be feasible for widespread use.


Conor McLaughlin - 4/12/2010 15:01:42

I really enjoyed both the Weiser and Bush readings. It's always fascinating when somebody with the intelligence of Mark Weiser accurately predicts how the world should proceed for the greatest benefit of humanity, and then how it does so. We are already beginning to see the effects of ubiquitous computing as laptops are carried from class to class and iPhones now give instant access to the information web through an incredibly complicated network of technology. To the everyday user, however, they are just accessing their phone and using it to check eBay in a standard method of interaction. The entire time I was reading Weiser's article, however, I could not help but feel the ominous tone that something like location awareness carries with it. Security of identity and information is of paramount importance to both individual freedom and that of intellectual creation. Weiser said that with properly implemented encryption technology such a world could be even more secure than our own, but Computer Security classes, and even the latest news, will tell you that as security gets more and more advanced, so do hackers' methods of circumventing those in-place securities. By making computing ubiquitous, the level of damage an aggressor could enact gets more and more catastrophic, and I think Weiser did not provide sufficient detail to address this ethical side of computing. Our goal as UI designers is to make technology invisible and natural to the users, but an area of Computer Science not often explored is how far technology must go to aid the individual. At a certain point, individual responsibility for tasks must be given to the user, and I believe the future of the industry is coming to a crossroads where such ethical decisions will come to the forefront of discussion.


Victoria Chiu - 4/12/2010 15:03:09

Ubiquitous computing brings computers into daily life so seamlessly that people barely realize they are everywhere, just like street signs. Conventional settings of a desktop computer will not be appropriate for ubiquitous computing. For example, we do not assume any software or hardware configuration will change while a personal computer is running. But in terms of ubiquitous computing, we would not want to shut down every computer in the room to install new software on one of them.


Bobbylee - 4/12/2010 15:03:55

"All our steps in creating or absorbing material of the record proceed through one of the senses—the tactile when we touch keys, the oral when we speak or listen, the visual when we read. Is it not possible that some day the path may be established more directly?" in As we may think by Bush, suggests that he has a really good insight about how human might receive/create information in the future. Maybe in one day,we don't really have to see the real product in order to visualize it. We might just need a device to emit some electrical vibration to our brain, our processor. Then we can already see the things. So it is like the standard input of the computer, which is the keyboard. But now, we input from places other than standard input. I really believe that it is a very good insight of human might create/absorb messages in the sense of transmitting the electrical vibration. Who knows, in the future, we might be able to translate our feelings to signal and to transfer to some other people. Such that we share the feelings.


Divya Banesh - 4/12/2010 15:10:08

As I read about the history of inventions, and of the ideas the author has for future inventions, I can see the progress of innovation. The author talks of different ways to record media, from stone to paper to signals moving through wires. From a UI perspective, this makes me realize that no matter what the norm is now, it may change in the next five to ten years. Today, the mouse and keyboard are the standard forms of input, but touch devices and motion sensors as well as speech input have already become very popular. Soon, the mouse might disappear, and users might use motion detection with touch sensors to 'click' on a button instead.


Hugh Oh - 4/12/2010 15:25:18

Weiser explains that as technology gets continuously more sophisticated, people will progressively be less aware of it, which places even greater emphasis on human psychology. Previous readings have really emphasized the relationship between the user and the computer, but Weiser emphasizes the importance of the lack of a relationship between the user and the computer. Weiser argues that computers should be designed around the human psyche, such that computers are so intuitive that they meld into the background. It's an interesting way to approach the problem of HCI.


Dan Lynch - 4/12/2010 15:31:04

As We May Think: The author of this article poses some very interesting issues regarding the amount of information and technology we face in this day and age. The author goes into detail about the ways we have benefited from certain technologies, and how before a certain level of advancement, we could not have allowed for widespread use of the newest technologies. For example, an early conception of a calculator couldn't be released to the public because its development and maintenance costs far exceeded anything that was feasible. In that sense, technology is what has allowed other technologies to exist for the public, and in many cases, to exist even in research. A major theme of this paper is transmission of information, and in addition, the encoding or compression of information for later retrieval or decoding. An example discussed is the stenotype and vocoder combination. One device is used to convert human speech to symbols that can later be converted to text. This is the stenotype, which is operated by a human. However, the combination of this device with a vocoder is much more powerful. The vocoder strokes the keys for you based on the frequencies of the sound it is receiving; hence, the human can be eliminated from the stenotype. The topic of indexing is heavily discussed, which provides fast mechanisms for accessing otherwise hard-to-find data. There are different methods for indexing; some are basic, like alphabetic, and others can combine associations and categories.

The Computer for the 21st Century: The idea of this paper is that one day we will be completely oblivious to the technology around us, and merely absorb the information that needs to be transmitted from the technology to our brains. This notion is called "embodied virtuality". A fairly dated paper by this point, it discusses a hypothetical story of a woman named Sal. Throughout her everyday interactions, she interfaces with multifarious technologies as if they were normal human interactions. They go "unseen". However, the reader must take into account that these devices are not really embedded into our daily lives just yet. For example, the automatic coffee maker does exist; however, the extent to which it was wired and programmed is far beyond what anyone typically does today.



Nathaniel Baldwin - 4/12/2010 15:33:29

The first article was a fascinating read. I'm still sometimes in awe of how much technology has advanced since I was a kid in the early 1980s, but taking a step back to look at the state of technology when my father was a kid - really, not all that long ago on a grander scale - was a trip. Not only was it interesting to hear the "current" technology of 1945 discussed, but more interesting was the view of the future from that point in time. The author was, overall, almost eerily prescient - perhaps best encapsulated by an early quote: "The world has arrived at an age of cheap complex devices of great reliability; and something is bound to come of it." Of course, the specifics of the future technologies he described were almost laughably inaccurate, but the general ideas he put forward, and his emphasis on the accessibility and cohesion of accumulated human knowledge was spot-on. (Also, he acknowledged, towards the end of the article, the likelihood of yet-unimagined technologies changing the specifics that he described). A fun read.

The second article I found similar in general tone, and was an interesting contrast to the first - visions of the future from 20 years ago instead of 65. In some ways, the article was spot-on: computers are becoming more and more ubiquitous, and they come in a variety of sizes for different tasks. I wasn't totally sold on the tab/pad/board idea - particularly the idea of having separate physical objects to represent separate digital items / ideas. Perhaps I was misunderstanding, but as I read it they sounded wasteful and unnecessary. I like that I can pick up a digital "idea" - a file - and move it easily from place to place with a small object (a flash drive). But it's often easier to share just over the internet - which, in fairness, was nothing like what it is today when this article was written in 1991 - even the world-wide web was in its extreme infancy. I was glad that the article gave a nod to privacy at the end, since the rest of the article raised some serious concerns in my mind, especially the description of the lady's home computers having recorded her neighbor's movements - yikes! I think they could have been a little more concerned, though, than they seemed.


Richard Lan - 4/12/2010 16:09:39

Science has not caused the way we think to evolve, because it mainly focuses on advancing our knowledge of material entities. With all of the advances in knowledge, humans have become bogged down by too much information. The author uses the example of the glut of research papers in the academic world, where only a minute fraction of them have any relevance to the topic one wants to pursue. Therefore, it takes forever to find the information that is important to one's studies. To remedy this problem, Bush suggests that data storage devices can be miniaturized. He also proposes a direct manipulation device, called a “memex”, which allows users to group information by association, rather than by the traditional method of indexing. The argument for this is that the human mind operates by association, so an effective machine will try to mimic the way the mind actually works. Bush describes actions such as “tapping on an object to view it” and “levers for taking photographs”. The point of building this machine is that the user would have tons of information and would be able to access pieces of it faster than if they were taking a book off a shelf. The description of this machine foreshadows the ubiquitous computers talked about in the second article.


Owen Lin - 4/12/2010 16:11:29

It's really amazing how people predicted the age of computers many decades ago, and just now computers are really becoming a prominent part of everyday life. We are just now seeing ubiquitous computing in the form of laptops, netbooks, and smartphones. On campus you'll often see people playing with their iPhones and Blackberries or chatting with friends on their laptops. In fact, anywhere you go, chances are there is a computer of some form nearby. It is amazing to realize that people saw the need for a better way to organize and catalog humanity's progress as a whole than print and paper (as shown in the 1945 Bush article in the Atlantic Monthly). And in 2010, we can easily see the Internet (the Age of Information) as the solution.


Kathryn Skorpil - 4/12/2010 16:21:47

In "The Computer for the 21st Century" we are given an idea of what it would be like to live in a world where computers are everywhere. I think that though our world would be much more efficient, we would also become really lazy, at least lazier than we already are. Computers are amazing and do make our lives easier, but I definitely think that it has hindered our intelligence and certainly our ability to interact with other people. I hope that a day does not come where we will rely on computers for everything the way that the reading tells us we will.


Long Do - 4/12/2010 16:21:50

It's really interesting to read how the two authors saw future technology. The view of the "The Computer for the 21st Century" article seems much closer to our current status and the way we are heading, since it is more recent. The ability for several machines to have very simple operating systems but be connected to each other and exchange information is approaching. An example would be the fact that I can be listening to music on my iPhone, walk to my car and have it sync up using Bluetooth so that I can listen through my car's speakers, and then be able to handle a call using those same speakers. The "As We May Think" reading is interesting in highlighting how far we've come and how reliable we have made machines. I also saw that many of the predictions it made have come true, such as using bone conduction for sound transfer. It is amazing how we have expanded our society through the introduction of machines and how they have changed us and opened us up to new possibilities and healthier, more efficient lifestyles.


David Zeng - 4/12/2010 16:29:22

There are two obstacles on the way towards ubiquitous computing in our lives. The first is the integration of the systems used. The article says that we need a central hub to relay and transmit information. With so many different types of OS's in today's world, it would be difficult to force compatibility between them, and even more so to convince everyone to use the same one. There are both financial and technological reasons why people would want to avoid this setup. The second challenge is a lack of alternatives. If in fact computers were used in every part of our lives, then we would need a backup in case they failed. In the case of writing, we can write on many surfaces with many different tools in case we lack access to a source or our pencil breaks. However, in the case that a computer breaks, the common knowledge necessary to replace or fix what has happened is definitely not there. Unlike writing, teaching technology or even basic computer proficiency is a difficult task. While we are trying to make computing a basic part of the educational process, it will still be a long way before we can achieve the fluency that people today have with writing.


Andrew Finch - 4/12/2010 16:33:20

Weiser makes a very good point in "The Computer for the 21st Century" that I hadn't realized until reading this article. Today--almost 20 years after this article was written--we tend to think of computers as items that have been very much integrated into our lives. While we have certainly made some significant progress since 1991, we still have a long way to go until computers become integrated at the seamless and ubiquitous level that is described in this article. Computer monitors are bigger but still fairly small for a true "desktop" simulation (only 20-24" diagonal is standard now), we still use paper a lot, RFID isn't used in nearly as many places as it should be, and the list goes on. Even in 2010, we don't have computer terminals that recognize who the user is automatically and apply the appropriate settings, as Weiser envisions. We have many (too many) ways of transporting data from computer to computer, but nothing as simple and intuitive as dragging a window from a computer to a handheld device, carrying the handheld into another room, and then dragging that window from the handheld to another computer. This level of integration and seamlessness is something we need to focus more attention on, as it has great potential for improving our relationships with machines and information.


Mohsen Rezaei - 4/12/2010 16:45:58

Generally, both readings point out how technologies have come in handy and have made human lives easier, faster, and much more manageable. Comparing old technologies and the ways people used to perform their tasks to the new technologies and how people perform their tasks now, we realize how implementing and designing new and helpful systems helps us in our daily tasks. For example, with digital cameras we don't have to wait for a photograph to be printed. The new ways of doing things make for a more manageable life. On the other hand, there are people who don't like the way new technologies are taking over this world and changing the way human beings used to live. But taking this as a fact, the authors conclude that using computers and new systems that help us in our lives will be "as refreshing as taking a walk in the woods."


Brandon Liu - 4/12/2010 16:47:32

I found it interesting that the article by Vannevar Bush describes speech-to-text and text-to-speech systems. In the 1940s, he probably considered these the most usable interfaces for interacting with machines. However, even in 2010 these kinds of interfaces do not have widespread adoption. In the SciAm article from 1991, the author describes a similar situation: "her alarm clock, alerted by her restless rolling before waking, had quietly asked 'coffee?', and she had mumbled 'yes.' 'Yes' and 'no' are the only words it knows." This article also mentions: "She wipes her pen over the newspaper's name, date, section, and page number and then circles the quote. The pen sends a message to the paper, which transmits the quote to her office." My point is that most of these "prophetic" accounts of future interfaces rely on already-familiar modes of interaction, that is, writing and speech, rather than something more artificial such as keyboard input. Even today's "fantasy representations" of user interfaces involve technologies like these. Based on our studies in class, these kinds of technologies aren't actually any more efficient than the input methods we're used to.


Arpad Kovacs - 4/12/2010 16:48:30

"The Computer For 21st Century" provides a fascinating vision of ubiquitous computers, which are invisibly interwoven into our world and enhance reality. Rather than multimedia computers which must be carried around and become the center of attention, the proposed system of embodied virtuality would rely on tabs, pads, and boards which have no individualized identity or importance, and like scrap paper can be grabbed and used anywhere. Since the time of this article, technology advancements have removed most technical obstacles to ubiquitous computing: wireless communication (RFID/ZigBee/802.11), processors (ARM/MIPS/MSP430), and displays (LCD/LED/E-ink) have fallen in cost so much that they are appearing in almost every conceivable consumer durable goods category, from your car to your camera. Likewise, you could view your Cal Student ID, Calnet/Google/OpenID login, or even a USB stick as the manifestation of an "active badge" that lets you open doors or access your data from any platform that can run a web browser. All that is missing is the integration of these various components into one unified network (the Internet?), and a distributed operating system that can share information between these various devices (although it is likely that Microsoft, Google, or academic researchers have already found a solution for this). But the question yet to be answered is user buy-in: Are people willing to trust and share these ubiquitous computers so much that they become transparent and freely available, or are the notions of private property and individualism too strong for us to give up our "personal computers" and cellphones? After all, even today we don't have "ubiquitous textbooks" or "ubiquitous notepads", so why should it be any different for computing devices?

"As We Think" is an amazingly prescient prediction of the future, considering that it was written in 1945. With some liberal interpretation, we can credit Rush for accurately foreseeing rough analogies to voice recognition (Vocoder + stenotype), databases (central information retrieval), telephone switching circuits, and digital photography (scanline-based dry photography). I thought that the concept of the memex, an associative indexing trail that reminds me of the hyperlink system used in HTML (and in particular Wikipedia), was particularly insightful. The key takeaway I got from this article is that there is a lot of published information out there, and we as computer scientists have a duty to organize and make this knowledge accessible to the world, in order to continue paving the way for scientific breakthroughs.


raymondlee8888@gmail.com - 4/12/2010 16:51:30

The description of pads in this reading reminds me of some fictional, concept, and actual technologies. Star Trek: TNG depicted pad-like computers called PADDs that could display many different types of information and appeared to completely replace paper functionality, and each PADD did not seem to have much individuality or importance because they seemed to be freely passed from officer to officer.

I've also seen concept paper-thin displays that could even be rolled up like a newspaper (it was ideal for displaying newspapers). One big step toward achieving that tech is Amazon's Kindle, which can display newspapers and books and seems to fit the "desktop" metaphor better than a laptop. And of course the recent iPad seems to be advancing us even more toward the idealized pad as discussed in this article.


Joe Cadena - 4/12/2010 16:57:35

I enjoyed reading an article from the past that envisions a future of ubiquitous computing where everyday objects are replaced with micro-computing devices. But one claim Mark Weiser makes is that in order for computing to disappear into American society, computers have to be abundant and treated like their original counterparts. Although I agree with his general idea, I disagree with his example claiming "pads" should be strewn about a work desk similar to how notebooks currently are. I believe a future of ubiquitous computing combines tasks with objects into similar devices, thus reducing the number of single-function objects and increasing the availability of solutions.


Calvin Lin - 4/12/2010 17:05:23

Seeing that the Bush paper was written in 1945, I think much of what Bush was hoping to see with technology in the future has come true. Today, the web is dominated by search and retrieval of relevant information pertaining to a user’s desires and characteristics. Bush talks about an association of thoughts, which instantly reminded me of seeing “Related searches” on search engines, and links to other related articles on the thousands of pages I’ve visited. In addition, there are many sites that allow users to customize content based on personal interests. Database systems such as MySQL allow users to quickly look things up and search among a vast bank of information. Overall, the internet and other technologies can be a tool for cognitive tasks, enabling people to go beyond their own natural/physical limitations.

Although written just in 1991, the Weiser article was also a reflection of recent trends in technology development. When you think about it, practically everything these days runs on computers, and these systems have become so deeply integrated that you don’t really notice on the outside because we take it for granted. For example, most people don’t think about how the internet works, how the traffic light system works, or how banks keep things synchronized across thousands of ATMs. Weiser describes the PC as a “different world”, but today I would argue that we have fused the two together. Personal computing has adjusted to fit our daily lives, but we have also adjusted our lives to fit with the digital world, as millions of people rely on computers for so much every day.


Wei Yeh - 4/12/2010 17:16:41

Although the idea of ubiquitous computing is incredibly exciting, if one thinks about it carefully, it is also very scary. Technology has helped make our everyday activities more efficient, but at the same time, it has also made us dependent on it and thus very lazy. What scares me is a future where a person can sit on a couch all day long and get everything he needs to do done with the help of all the computers around him. Being alive would be really boring.


Andrey Lukatsky - 4/12/2010 17:22:32

I was quite surprised to see the accuracy of the predictions made by the author of As We May Think - a piece of writing over half a century old. He seems to have predicted the existence of the Internet itself. I had similar feelings about the Scientific American article as well. It seems like the authors of both works truly understood the social nature of our lives, and predicted systems that would serve to enhance this.



Weizhi Li - 4/12/2010 17:24:59

The article "As We May Think" is wrote by Bush about 60 years ago which seems out dated but it's amazing that a lot of Bush's prediction have become reality. It's very interesting to see that those ideas in the paper of Bush's book actually becomes what we use everyday in reality. Bush's article accurately predicted the development of Internet, voice recognition. One thing I noticed is the article also describes society changing to meet the needs of the technology, for example, developing languages that are conducive to recognition. I don’t know if we have yet developed a research web, where individuals can form trees depicting their train of thought.


Saba Khalilnaji - 4/12/2010 17:29:43

According to the article, ubiquitous computing allows computers to fade into the background. They become something used in everyday life without our having to actively think, "I'm going to use a computer." It's like the power sockets in the walls: one doesn't actively think about the wires in the walls or the current that has to flow when you plug in your vacuum. You subconsciously know that part of the process of using a vacuum involves plugging it in and turning it on, among other things. It's been integrated into everyday life. Computers are getting closer to such integration; however, I do not understand how ubiquitous computers pose no barriers to human interaction. Computers themselves require some sort of attention and thought process to be used for most tasks. How can the same device change in a way that removes such concentration to better allow social interactions between colleagues?


Richard Heng - 4/12/2010 17:32:40

In Bush's article, I particularly liked the notion that the mathematician is not necessarily a master of arithmetic. Matrix multiplication becomes a powerful abstraction that he utilizes. This, coupled with the fact that the author believes specialization is inevitable, makes it seem inevitable that the future will not be dominated by low-level languages; abstraction will become increasingly more prevalent and powerful. His writing is prophetic not only in the physical items he names, but in the concepts behind why they develop. These concepts are general, and I believe they will continue.


Linsey Hansen - 4/12/2010 17:32:43

So, while I'm sure that the Bush article has quite a bit of historical significance as far as inspiring good user interfaces goes, I personally didn't enjoy it very much. I felt like it was just a long way of stating that by optimizing tools and a user's interaction with them, the overall performance of the user is greatly increased, which seems repetitive to me after being in this course for so long. However, I really did enjoy the Weiser article, mostly because it was written in the early 90s, yet looking at technology today we are slowly moving toward some of the devices mentioned. I guess we technically don't have anything too 'tab'-like aside from smart phones, but they are getting to the point where they can be used for tasks such as authentication, sending personal information, and tracking people. Then the 'pad' made me giggle, mostly because of the iPad, though I feel like the concept of having multiple pads, each with different information on them at once, seems cumbersome. However, pad-like devices could definitely find uses in places such as hospitals (for instance, approaching a room would automatically download the resident patient's medical sheet) or in business meetings, for tasks that a computer would make too complicated to the point where it isn't worth bringing one, such as sharing documents (i.e., it would basically replace the stuff normally kept on clipboards). As for the board, I mostly thought of professors who use tablets along with projectors. If a professor had something 'board'-like that could show the professor's slides on one side, then the other side could be a digitized and savable workspace (for working out equations, drawing diagrams, etc.) that could be uploaded along with lecture notes. That would be pretty awesome; however, it would also be really, really expensive, unless you wait a while for technology to catch up, as Bush described in his article (though if you think about it, it would be shifting the cost of buying things such as tablets, which can often be up to twice the price of non-tablet laptops with similar specs, away from various professors over the years, so it might even out in the really long run).


Mikhail Shashkov - 4/12/2010 17:34:04

Some of the more specific discussions are a bit over my head (don't really know too much photography jargon) but it was interesting to get a historical perspective and see what people predicted for the future back then.

As for the second article, it sounds scarily like Brave New World or the computer-screen walls of Fahrenheit 451. Anyway, I'm not sure it is possible, in the near future, for computers to become fully embodied in everyday life. There will simply always be the sensation of interacting with a machine when clicking, touching, or even viewing a screen; there will always be a difference between viewing a screen of any kind and viewing the real world. Perhaps only in the distant future, when devices can become incredibly miniaturized and be placed into complex shells, or even into "real" items, will this process actually come to fruition.


Yu Li - 4/12/2010 17:37:45

This article is correct in pointing out that many of the advances scientists have made tackle problems involving man's physical strength, but not his mental capacity. Instead of focusing attention on material things, perhaps scientists should turn their minds to creating things that further knowledge and help people extend their mental capabilities. I believe this would benefit the world more than the physical aids we have so far.


Peter So - 4/12/2010 17:40:24

Embodied reality may appear at first to be great, as in the article by Mark Weiser, but when computing becomes ubiquitous and is no longer a conscious thought for the ordinary person, you are essentially changing the way people interact with the world and one another, which in turn affects a person's quality of life. As technology continues to develop around serving the needs of man, people become that much more dependent on technology to survive. Looking at the example of improved transportation technology, many urban dwellers have lost sight of how to survive on their own. A typical city person would tell you he got his food from a grocery store and would have very little, if any, knowledge of where the food came from and how it was made. If a disaster were to occur and food could no longer be delivered from the farms to the cities, many city people would starve.

Extending this example of dependence to the corporate realm, if people become overly dependent on computers, then when the power goes out or the system and network crash, each individual is lost and helpless on his own. I am not abandoning the idea of embodied reality; I think it will be great to explore what the human mind is capable of once today's daily tasks take little to no time and a person can think freely. However, it is our responsibility as living beings to know how to survive, and although technology continues to become a larger part of our lives, we must still provide ourselves the means of basic survival to avoid our own self-destruction.


Jeffrey Bair - 4/12/2010 17:40:49

It is interesting to see how much technology has advanced the way we produce images and text, and how we store them as well. Long ago, images were difficult to convey and took up a lot of space; now they fit on small chips as bits of data and can be reproduced almost immediately with computing power. The leaps in technology have also affected the everyday lives of the general populace and our social nature. Rather than meeting people in real life, social interactions online are becoming more and more prevalent. Finding information is no longer a hassle of looking through a physical medium; it can be found with a simple search on the internet. It's also interesting to see how we make technology model our own interactions in real life. For example, we used to have books, a physical medium stored forever on the page. Now we have computers that access information on a temporary medium, much like how our brain may retain information for a short time before it is committed to long-term memory or simply forgotten, like data being written back to a hard disk or wiped.


Spencer Fang - 4/12/2010 17:46:29

The way we use computers now slightly resembles the ideas described in the two articles. They imagined future computers to be ubiquitous, so that our everyday lives would be surrounded by many interactive computers that each perform specific tasks. Together, the articles describe a mode of computing where task specific computers are favored over general purpose computers, and there is a miniature computer tailored for each aspect of our daily lives. For instance, a garage door may have a computerized manual that can respond to a wireless "lost manual" message by beeping out loud.

However, the direction computing technology is going is quite different. We are approaching a world where we have access to knowledge on any topic at any time, but it is achieved not through ubiquitous computing devices but through ubiquitous computer networks. Instead of having computers cheap enough to replace paper documents, we have personal devices that we always carry around and that are capable of retrieving and displaying a wide range of documents. If a customer loses his garage door manual, a more realistic outcome is that he visits the manufacturer's web site and downloads a digital copy; he can do this sitting in the driver's seat while parked in the driveway. The idea of handheld devices on high-speed, ubiquitous networks achieves much of what the articles envisioned.


Jordan Klink - 4/12/2010 17:50:55

This week's reading was very insightful in giving a historical perspective not just on HCI, but on computing in general. Indeed, in order to prepare for the future one must understand the past. I'm curious to see what lies ahead, and what the reading would look like had it been written today. What exactly is the computer for the 22nd century? Only time will tell, but after looking back at the 21st century so far, I can safely say that the future is very bright.


Vinson Chuong - 4/12/2010 17:55:55

This week's reading gives a perspective on what people last century thought the future might look like. It talks about how technology will evolve to allow us to complete our tasks more efficiently. It talks about how technology will evolve to do our tasks for us.

One thing I believe merits discussion is how we will evolve in the face of evolving technology. As we gain more sophisticated tools that let us abstract our thoughts and actions to higher levels, does that change our way of thinking and how our minds work? Will interface paradigms that are superior for certain tasks now still work well for the same tasks in the future, and will such tasks even exist? The discussion of technological advancement and ubiquitous computing seems very linear, yet the meanings of those terms may change. We may very well come to think of computers as ubiquitous without significant advancement in technology, if we don't already think so. Our entire concept of a "computer" may change as well.


Brian Chin - 4/12/2010 17:56:20

I thought the first reading, from the Atlantic, was very interesting. It is somewhat amazing how many of the predictions made by a scientist in 1945 turned out to be relatively correct. In the reading he talks about how information will be stored in a very compressed format. Although he predicted it would be stored on film, today DVDs and other formats can store what would have seemed, to people back then, an incomprehensible amount of information in a small container. The author's talk of a way to do calculations faster also made me think of the modern computer and calculator, which make calculations many, many times faster than could have been done in the author's time; he was also correct in predicting that they would run on electricity. I found his discussion of selection interesting as well. It seems similar to the modern problems of storing and retrieving relevant pieces of information.


Darren Kwong - 4/12/2010 18:00:55

Bush's article made a rather intriguing point about information. It seems to suggest that a deep understanding of human cognitive processes would allow humans to push knowledge and information around at astounding rates. More direct forms of communication would completely change human routines. However, there are definitely limitations due to the nature of human society. The advancement of technology is limited by what people are willing to try.


Jonathan Hirschberg - 4/12/2010 18:01:41

Why doesn't a device like the memex exist today? The memex is constructed to work the same way our long-term memory works: it brings up things that are related by association to what we're looking at. The reading suggests that something like this would be very intuitive and easy to use. Perhaps it would be, but it would need to be built and tested first. How would the indexing be done? What criteria would determine how closely things match? Perhaps our current technology makes it more difficult to retrieve data by association than by traditional alphabetical indexing. Or is the idea impractically complex?

One common trend I noticed in these two readings is that they describe idealistic scenarios in which technology is used to do tasks far more advanced than is currently possible. It is important to resist the urge to dismiss these ideas as impractical fantasies and instead think about how the pieces could be developed little by little over time, if they are practical to pursue. But whether you consider something practical also depends on current technology. For example, something that requires a really inefficient algorithm right now, say O(2^n), may not be so bad in the future if research produces a more efficient algorithm that cuts the running time down to something reasonable. Things that were considered impossible before would then become possible.
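As a purely illustrative sketch (my own; the class and method names below are invented, not anything Bush specified), retrieval by association might look less like an alphabetical index and more like named trails of linked documents that a user builds up and can replay later, echoing Bush's example of tying notes on the Turkish bow to an article on elasticity:

    # Toy sketch of memex-style associative trails: documents are nodes,
    # and a "trail" is an ordered chain of associations built by the user.
    class Memex:
        def __init__(self):
            self.documents = {}  # doc id -> text
            self.trails = {}     # trail name -> ordered list of doc ids

        def add_document(self, doc_id, text):
            self.documents[doc_id] = text

        def link(self, trail_name, doc_id):
            # Append a document to a trail, creating the trail if needed.
            self.trails.setdefault(trail_name, []).append(doc_id)

        def follow(self, trail_name):
            # Retrieval by association: replay the trail in the order built.
            return [self.documents[d] for d in self.trails.get(trail_name, [])]

    memex = Memex()
    memex.add_document("bow", "Notes on the Turkish short bow")
    memex.add_document("elasticity", "Article on the elasticity of materials")
    memex.link("why-the-longbow-was-outranged", "bow")
    memex.link("why-the-longbow-was-outranged", "elasticity")
    print(memex.follow("why-the-longbow-was-outranged"))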


Long Chen - 4/12/2010 18:02:48

Sidenote: When I first opened the article and the editor's note stated that Vannevar Bush held a position in the government, I automatically assumed he was part of the Bush political family tree. I realized I was wrong once I saw the article was written in 1945. Nonetheless, I wikied him right away and followed several interesting wiki branches. He was a primary influence on Ted Nelson, another famous figure in his own right, and his ideas also shaped "Silicon Valley" and the World Wide Web.

Vannevar Bush made a great point about the "growing mountain of research," because the amount of innovation and creative thought produced in universities and laboratories around the world is astonishing in the 21st century. If there was already a mountain of research in the 1940s, there is a planet's worth of research now. Specialization has led to great inventions in specific areas, but it has also made understanding and acceptance of some great ideas impossibly hard, keeping them from ever reaching production. Mainstream understanding of the kinds of research conducted at an institution such as Berkeley is so limited that the outcomes are underappreciated, owing to the vast divide between the everyday person and the intellectual. Bush's article must have been influential right after America won the war, but it may help the current generation of researchers and scientists to pay it another visit and gain a historical perspective that is applicable to the current environment.

"The Computer for the 21st Century" article had no dating and I kept wondering when it was written. A direct quote from the article: "The next step up in size is the pad." I wonder if the iPad was something they had in mind. Personally, I believe the iPad offers the first step in ubiquitous computing in that the convenience and portability of the device easily integrates into the user's everyday life. It's not necessarily a working tool or an entertainment device, but it's a one-size fits all device that smoothly fits into everyday tasks. I find the concept of ubiquitous computing extremely exciting and in particular the UI aspect. This class is definitely a nice and useful first step into designing tools and models that simply people's daily lives.


