Collaborative Visual Analysis

From CS294-10 Visualization Sp11


Lecture on March 2, 2011

Slides


Readings

  • Voyagers and Voyeurs: Supporting Asynchronous Collaborative Information Visualization. Heer et al. (html)
  • CommentSpace: Structured Support for Collaborative Visual Analysis. Willett et al. (html)

Optional Readings

  • Design Considerations for Collaborative Visual Analytics. Heer and Agrawala. (html)
  • Many Eyes: A Site for Visualization at Internet Scale. Viégas et al. (pdf)

Julian Limon - Mar 01, 2011 11:32:28 pm

I really enjoyed Wesley's lecture about collaborative visual analysis. My research at the I School deals with collaboration in organizations, so I was very interested in learning about the findings on tools that facilitate collaborative data analysis. I think Michael brought up a great point in class: one key difference between collaborating within an organization and collaborating on the open web is the matter of incentives. People in an organization or established group usually have a real stake in the outcome of the analysis task. Moreover, their identity is persistent and traceable. On the open web, this is rarely the case. That's why I believe a lot more work needs to be done to tap into the wisdom of the crowds for data analysis.

I feel that tools like CommentSpace might benefit enormously from a Games With a Purpose (GWAP) approach. If the exploratory activity can be divided into very granular and fast tasks, users on the open web may be more likely to spend a couple of seconds tagging a comment. For example, if they were presented with a small version of the graph and three tags (evidence-for, evidence-against, unrelated) to select from, they might help classify and explore a larger data set. I envision a CAPTCHA-like system where users are challenged with a small task to prove that they are human (of course, there would need to be a countermeasure to make sure the answer is right). Sites like data.gov or Wikileaks could then get a lot more visibility and application.
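The countermeasure Julian mentions could work like reCAPTCHA's known-answer check: each challenge mixes one chart whose correct tag is already known in with the real tagging tasks, and the worker's answer on the known chart validates their other answer. A minimal sketch, where all chart IDs, the tag set, and the gold set are hypothetical:

```python
import random

# Hypothetical "gold" set: charts whose correct tag is already known.
GOLD = {"chart_042": "evidence-for", "chart_107": "unrelated"}

TAGS = ("evidence-for", "evidence-against", "unrelated")

def make_challenge(unlabeled_ids, gold=GOLD):
    """Pair one real (unlabeled) task with one known-answer check task."""
    real = random.choice(unlabeled_ids)
    check = random.choice(list(gold))
    return {"real": real, "check": check}

def accept_answer(challenge, answers, gold=GOLD):
    """Keep the real answer only if the check task was tagged correctly."""
    if gold[challenge["check"]] != answers[challenge["check"]]:
        return None  # failed the known-answer check; discard the real answer
    return (challenge["real"], answers[challenge["real"]])

challenge = make_challenge(["chart_001", "chart_002"])
answers = {challenge["real"]: "evidence-against",
           challenge["check"]: GOLD[challenge["check"]]}
print(accept_answer(challenge, answers))
```

Aggregating several workers' accepted answers per chart (e.g. majority vote) would then address Julian's point about making sure the answer is right.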

Dan - Mar 03, 2011 12:32:20 pm

The CommentSpace and Voyagers and Voyeurs papers were really interesting. I like the new perspective that casts interactive visualizations as a collaborative social space. This opens many doors, as crowdsourced opinions are generally more accurate than one person working alone. In the context of transmitting large amounts of data through the human visual system, it is not always clear which method of displaying information is most efficient, since we must balance rules and aesthetics. A single individual may be biased toward one end of that spectrum. The hope is that crowdsourcing can improve and optimize this transmission of information when creating visualizations.

In particular, sense.us was a system that enabled users to discuss and annotate graphics collaboratively. Users could annotate with arrows, labels, and shapes, and could also upload their own data. Arrows were used the most, text second. All the other shapes (ovals, pencil, lines, and rectangles) were used much less, with rectangles used least.

I think a key to the success of this, as they mention, is application bookmarking. That is, the GET parameters are placed in the URL so that the visualization state can be shared as a link. This makes the data and visualization more accessible and, hence, more usable.
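This kind of bookmarking amounts to a round trip between view state and query string. A minimal sketch with hypothetical parameter names (the paper does not describe sense.us's actual state format):

```python
from urllib.parse import urlencode, parse_qs, urlsplit

def bookmark_url(base, state):
    """Encode the current visualization state as GET parameters."""
    return base + "?" + urlencode(state)

def restore_state(url):
    """Recover the visualization state from a shared URL."""
    query = urlsplit(url).query
    return {key: values[0] for key, values in parse_qs(query).items()}

# Hypothetical state for a population chart.
state = {"measure": "population", "year": "1950", "region": "midwest"}
url = bookmark_url("http://sense.us/vis", state)
print(url)                          # http://sense.us/vis?measure=population&year=1950&region=midwest
print(restore_state(url) == state)  # True
```

Because the entire state lives in the URL, a comment can point at the exact view it discusses, and anyone following the link lands in the same view.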

Krishna - Mar 03, 2011 04:13:39 pm

CommentSpace is an excellent interface for collaborating on visual analysis of information. The feature I like the most is that, unlike many other collaboration frameworks, CommentSpace seems to be domain independent. My understanding is that as long as a visualization can be expressed in a RESTful way, it could be used within CommentSpace.

A quick thought: when a user wants to respond to a hypothesis or comment with evidence for or against it, the user generates a visualization, annotates it, and links the generated visualization to their response. Although the visualization and its annotations would suffice to explain a user's argument, they may not fully describe how the visualization was generated. For instance, generating certain visualizations might involve complex filtering or transformation steps, and these steps might break assumptions made by the user who posted the hypothesis. It would be interesting if visualizations supported a replay button that narrated the generation process; from my understanding, the CommentSpace framework could easily adapt to such re-playable visualizations. I believe this feature would further enrich the capabilities of such collaborative visualizations for scholarly analysis.
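The replay idea boils down to keeping an ordered log of the operations that produced a view. A minimal sketch (the session API and operation names are invented for illustration; CommentSpace does not expose such a log):

```python
class VisSession:
    """Records each derivation step so the final view can be narrated back."""

    def __init__(self, rows):
        self.rows = rows
        self.log = []  # ordered (operation, description) pairs

    def filter(self, predicate, description):
        self.log.append(("filter", description))
        self.rows = [r for r in self.rows if predicate(r)]
        return self

    def replay(self):
        """Narrate the generation process, step by step."""
        return [f"step {i}: {op} -- {desc}"
                for i, (op, desc) in enumerate(self.log, start=1)]

# Hypothetical census-style rows: (year, occupation, count).
rows = [(1950, "farmer", 120), (1990, "farmer", 30), (1990, "teacher", 80)]
session = (VisSession(rows)
           .filter(lambda r: r[0] == 1990, "keep year 1990")
           .filter(lambda r: r[2] > 50, "keep counts above 50"))
print(session.rows)      # [(1990, 'teacher', 80)]
print(session.replay())
```

Attaching the log alongside the bookmarked view would let a reader step through exactly the filtering that produced the evidence, and spot any step that breaks the original hypothesis's assumptions.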

Brandon Liu - Mar 03, 2011 05:41:47 pm

Many of the previous lectures addressed how visualizations can be constructed from data; these papers are interesting since they address the other half of the question, which is how information gets from the visualization to the viewer.

One aspect I would find interesting is real-time annotation of visualizations that change over time. For example, the US electoral map ( http://elections.nytimes.com/2008/results/president/map.html ) changes over time as more results come in - I remember in 2008 constantly refreshing the page to try to notice new patterns in votes. For a system like CommentSpace to support this, it would have to keep a history of each change as it was made. It would be interesting to link this data (if it existed) to election liveblogs and sentiment tracking/auction systems to see how interpretations of the data changed over time.

Matthew Can - Mar 03, 2011 04:48:34 pm

In class, Wes mentioned a new project he's working on that attempts to crowdsource collaborative visual analysis. This is an interesting problem because it forces us to really think about the workflow underlying this complex process and how it can be broken down (of course, there might be many potential workflows, and that's something to explore). It would also be of interest to learn what parts of the process can be parallelized and where there are dependencies.

As I read through the Introduction section of the CommentSpace paper, I questioned why the system uses a small, fixed vocabulary of tags. I thought that limiting the expressiveness in this way would make the system less effective. But later on the authors argue that this makes the system easier to use since the users don't have to establish a common vocabulary. This makes sense, but I wonder if a hybrid approach could potentially be more effective. That is, have a set of initial tags that are core to visual analysis in general, and allow users to extend that set with domain-specific tags. Also, for the evidence tags, it might be useful to pair them with a degree (a continuum from strong to weak evidence), making it easy to focus on the most relevant evidence for a hypothesis.
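The hybrid vocabulary described above could look something like this: a fixed core set, user-added domain tags, and an optional strength value on evidence tags. A sketch under my own assumptions (the exact core tag names beyond evidence-for/evidence-against, and the [0, 1] strength scale, are illustrative, not CommentSpace's design):

```python
class TagVocabulary:
    """A fixed core vocabulary that users may extend with domain-specific tags."""

    CORE = {"question", "hypothesis", "evidence-for", "evidence-against"}
    EVIDENCE = {"evidence-for", "evidence-against"}

    def __init__(self):
        self.domain = set()  # user-added, domain-specific tags

    def extend(self, tag):
        self.domain.add(tag)

    def __contains__(self, tag):
        return tag in self.CORE or tag in self.domain

    def tag(self, tag, strength=None):
        """Return a (tag, strength) pair; strength applies only to evidence tags."""
        if tag not in self:
            raise ValueError(f"unknown tag: {tag}")
        if strength is not None:
            if tag not in self.EVIDENCE:
                raise ValueError("strength applies only to evidence tags")
            if not 0.0 <= strength <= 1.0:
                raise ValueError("strength must lie in [0, 1]")
        return (tag, strength)

vocab = TagVocabulary()
vocab.extend("census-artifact")        # hypothetical domain-specific tag
print(vocab.tag("evidence-for", 0.8))  # ('evidence-for', 0.8)
print(vocab.tag("census-artifact"))    # ('census-artifact', None)
```

Keeping the core set closed preserves the ease-of-use argument from the paper, while the domain extension and strength values add expressiveness only where users ask for it, e.g. sorting a hypothesis's responses by evidence strength.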

The CommentSpace paper made me think more generally about the problem of structuring and facilitating online discussion. Right now, the most prevalent thing out there is the standard commenting system for blogs. The only sort of structure they support is letting users reply to the comments of others (and reply to replies, and so on, usually up to some fixed depth). There's no way to group related comments, to organize them in a way that tells a coherent story of the discussion. Currently, when new users visit the page, they have to read the comments linearly in order. It seems like some of the work in CommentSpace can be applied here.

Jessica Voytek - Mar 03, 2011 08:58:21 pm

Wow Julian, what an awesome idea! It seems like creating an app for any data set might be challenging in the context of a final project for this course, but creating an app that leverages GWAP for one particular data set might be doable. The challenge would be to come up with a game people would want to play while still producing good results. We should talk more about collaborating on a final project.

Sally Ahn - Mar 04, 2011 12:54:47 am

The Voyagers and Voyeurs paper describes findings on social data analysis from the authors' sense.us website. I think opening up visual analysis to asynchronous collaboration is a great idea with much practical use. It shifts the focus from visualizing existing static data to visualizing dynamic data annotations and creates many interesting possibilities, such as Julian's idea of incorporating GWAP into collaborative visual analysis. This is an interesting idea, and it made me wonder if the process of breaking up larger data analysis into appropriate mini-tasks (possibly in the form of a game, as Julian suggests) could be automated in some way. I'm not sure how feasible it would be to generalize such automated methods across different datasets, but it would be an interesting question to explore that may lead to a useful crowdsourcing application. I was also impressed by the CommentSpace interface. I agree with Matthew that the existing interface for blog comments could benefit from the work done in CommentSpace; perhaps this can be extended to other online discussions such as internet message boards.

Michael Hsueh - Mar 04, 2011 01:49:37 am

I like Brandon's idea of an annotation playback feature for monitoring the progression of activity over time.

I think visual data collaboration is a great idea. It is unquestionably easier to communicate many ideas by referencing a common image configured to display a certain trend than to explain using only words. I think that the key challenge for a community based tool is that it must, in addition to facilitating collaboration, compel users to participate and contribute high quality data / content. This is no problem with a targeted audience, but a general audience might only explore data they are particularly interested in. The easier it is to interface the system with a variety of data, the more likely it will be successful. Both papers seem to recognize this.

It's definitely different, but I couldn't help being reminded of Google Earth when I was thinking about the annotation concepts. As you probably know, online users of Google Earth can tag random items of interest they stumble upon. In fact, a fairly substantial community of Google Earth explorers has emerged, converging in forums, blogs, and user groups. They are pretty much groups of data voyagers that annotate randomly discovered points of interest on a regular basis. Support structures for collaboration (comment linking, annotation tools, marks) are evidently very important not only to increase the quality of collaboration but also to motivate it. Wes's "YouTubes for data" epithet was apt. I find most YouTube comments hardly worth glancing at, but find myself a huge voyeur on Google Earth. I can spend hours sifting through mysterious and curious findings of other users -- motivated and enabled by good annotation tools combined with interesting data.

David Wong - Mar 04, 2011 06:01:34 pm

I like Julian's idea of bringing a GWAP-like feel to collaborative data analysis. I think that understanding who the audience is and providing suitable incentives for that audience is an important consideration, which Michael brought up during lecture. However, I also think that creating a suitable structure for collaboration is equally important. Julian's idea of a CAPTCHA-like system to generate knowledge about visualizations is a good example of a granular structure for collaborative data analysis (like reCAPTCHA). In their paper, the GWAP researchers outlined several structures for games that can be employed to effectively leverage the crowd. I think collaborative data analysis can borrow from those structures and create new ones for different audiences.

Saung Li - Mar 05, 2011 05:10:55 pm

I found the CommentSpace paper quite interesting, as the system helps people get together to analyze data. Following a GWAP-style structure is a great idea since it can motivate people to make proper tags, create hypotheses, and find evidence supporting or refuting them. Perhaps the drive to obtain and share information is enough for people to do proper analyses without such a structure, as with Wikipedia. Looking at some of the existing examples, though, this doesn't seem to be the case. It might be helpful if users could submit screencast videos showing how they perform their interactions when commenting. There might be interactions where one has to sift through several interactive options to make a point, so a quick video of this may help facilitate such discussions. With many comments, the comments section might also become too cluttered. Perhaps the comments could be divided into different topics to assist users in finding what they want.

Siamak Faridani 11:49, 6 March 2011

I enjoyed both papers. I find the idea of cooperative data analysis very interesting. It seems to me that it is what connects HCI and CSCW to visualization. There were interesting pointers in the presentation as well. Here are the points that I took away from both the presentation and papers:

  • By crowdsourcing data analysis, one can collect interesting facts about the underlying data.
  • People should be able to work incrementally on each other's annotations. This enables us to extract more patterns from the data.
  • A universal URL that stores the current state of the visualization can enable people to share their thoughts on Facebook and Twitter easily. This brings more people to the tool and more possibilities of reaching interesting conclusions.
  • Like any other interface, a tool for collaborative data analysis should be easy to use and user-friendly; this is where HCI can come into play.

The effort to do the same thing on the whole web was inspiring. What would happen if we enabled people to annotate the web? As far as I remember, annotation was part of the web protocol that Tim Berners-Lee had proposed, but why was it taken out of the protocol? Why can we not have a universal protocol for annotation? There are efforts that try to augment the web with an annotation layer, but these all require some sort of plug-in or proxy layer. It might be more of an HCI question, but I wonder what a universal annotation tool would look like and how we could build it.

Karl He - Mar 07, 2011 06:55:38 am

Collaborative analysis is definitely a valid way to find new and interesting ways to reflect upon the data. A large part of the problem in doing this, as noted by the guest speaker, is getting people to give input that benefits the visualization. CommentSpace has many of the same problems as sites like YouTube and Reddit: it takes a specific mindset when commenting to provide useful feedback.

I don't think CommentSpace is really heading in the right direction. While collaborative analysis is a good idea, I don't believe the Internet is the place to seek such analysis; it would be better to look toward fellow students and colleagues.


