• ESRI
  • NAVTEQ
  • Veriplace
  • AT&T Interactive
  • DigitalGlobe
  • Google
  • Yahoo! Inc.
  • ZoomAtlas
  • Digital Map Products
  • Microsoft Research (MSR)
  • Pitney Bowes Business Insight

Sponsorship Opportunities

For information on exhibition and sponsorship opportunities at the conference, contact Yvonne Romaine at yromaine@oreilly.com

Media Partner Opportunities

For media partnerships, contact mediapartners@oreilly.com or download the Media & Promotional Partner Brochure (PDF)

Press and Media

For media-related inquiries, contact Maureen Jennings at maureen@oreilly.com

Where 2.0 Newsletter

To stay abreast of conference news and to receive email notification when registration opens, please sign up for the Where 2.0 Conference newsletter (login required)

Where 2.0 Ideas

Have an idea for Where to share? where-idea@oreilly.com

Contact Us

View a complete list of Where 2.0 contacts

The Next Wave of AR: Exploring Social Augmented Experiences

Mobile
Location: Ballroom IV Audience level: Intermediate
Average rating: 2.00 (3 ratings)

This panel will discuss shared augmented realities, considering some of the essential possibilities and challenges inherent in this new class of social augmented experiences. The format is a presentation of a small set of scenarios (defined in advance, with audience input) describing likely future forms of shared augmented realities at differing scales of social engagement, followed by discussion among a panel of leading practitioners in technology, experience design, networked urbanism, interface design, game design, and augmented reality.

Current augmented reality experiences put who you are, where you are, what you are doing, and what is around you at center stage. But we can already look beyond the first stage of interactions, which assumes a single user seeing simple arrows and tags indicating POIs, and begin to explore shared (multiuser/multisource) augmented realities.

These social augmented experiences will allow not only mashups and multisource data flows, but also dynamic overlays (not limited to 3D), created by distributed groups of users, linked to location, place, and time, and syndicated to people who wish to engage with the experience by viewing and co-creating elements for their own goals and benefit.

Some examples of scenarios could include:

- historical and environmental overlays showing how a city used to be/and how this vision may be constructed differently by different communities

- proposed buildings in communities showing future changes to a structure/neighborhood, and the negotiations of this future

- sensors, both mobile and static, can contribute environmental data into city overlays, making this kind of data “not back story but fore story”: right where we are, right where it happens, and also available for analysis

- skinning the world with interactive fantasies

- real time augmentation building

- geo-spatial real time news dissemination from points in a city with time demarcation for information and emergency services

Having invisible aspects of the world made visible will create ways to improve sustainability, social equity, urban management, energy efficiency, public health, and allow communities to understand and become active participants in the ecosystems and infrastructure of their neighborhoods.

A distributed, open framework for AR can enable these layers, datasets, and open interchanges via shared augmented realities to create an interconnected experience of AR that fuses augmentation, data overlays, and varied media with immediate geo-spatial access and, perhaps most importantly, social & collaborative capabilities. Looking ahead, an open framework would allow end users to extend the reach and increase the value of augmented social channels over time.

The challenge of shared augmented realities is not just a matter of shipping bits around, but also of how we will understand and use these new channels and layers to create and negotiate different perspectives, understand a shared core, or express dissent, in order to build eventual consensus.

There are endless possibilities for distributed, open AR and for connecting place into an active field of information with end-user control. Open options for new layers will have impact across all social scales, from direct conversations, to small-scale collaboration (a product design-and-build team, or a neighborhood fixing potholes), to a global community mobilizing for a cause.

While shared augmented realities in immersive 3D may still be a ways off, new real-time communication protocols like Wave are demonstrating that we can use existing infrastructure and protocols for distributed augmented realities, allowing people to collaborate on the same world overlay in real time, creating dynamic overlays animated by time and conditions (see AR Wave).
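As a rough sketch of what one shared overlay update might look like on such a channel, consider a geo-anchored, time-stamped element that any participant could publish and others could render or edit. All field names here are assumptions for illustration, not part of any Wave or AR Wave specification:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class OverlayUpdate:
    """One geo-anchored element of a shared overlay (hypothetical schema)."""
    layer: str        # which shared layer this belongs to, e.g. "potholes"
    author: str       # who contributed this element
    lat: float        # anchor latitude
    lon: float        # anchor longitude
    timestamp: float  # when the element was created (Unix seconds)
    payload: dict     # media reference, sensor reading, note, etc.

    def to_wire(self) -> str:
        """Serialize to JSON for transport over a real-time channel."""
        return json.dumps(asdict(self))

# A neighborhood user flags a pothole; the update could be broadcast to
# everyone subscribed to the "potholes" layer for that area.
update = OverlayUpdate(
    layer="potholes",
    author="resident42",
    lat=40.7128,
    lon=-74.0060,
    timestamp=time.time(),
    payload={"note": "deep pothole near the crosswalk"},
)
message = update.to_wire()
```

Because each element carries its own anchor and timestamp, receivers can merge updates from many authors into one overlay, which is what makes the multiuser/multisource case tractable.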

The integration of augmented reality with sensor networks, the internet, and the world wide web will create new opportunities for us to engage collectively with the complex and often invisible ecosystems that make up our world.

e.g.:

- interacting with, responding to, and enhancing environmental data
- new connections and understandings between humans and the others that share our world – fish, plants, waterways
- “reading” places and their otherwise unseen data, with shared data allowing greater awareness in real time and greater depth in later analysis

Perhaps one of the most interesting features in Wave (and, prospectively, in AR Wave) is the ability to play back overlay data from a previous time in context (for example, environmental data from a year ago) – see writing as real-time performance
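That kind of playback can amount to little more than filtering a time-stamped event log and re-rendering a layer as it stood at some past moment. A minimal sketch, where the `playback` function and the event fields are hypothetical rather than an AR Wave API:

```python
# Minimal sketch of overlay playback: keep every overlay event, then
# reconstruct the state of a layer as of some past time.
# The function and field names are illustrative, not an AR Wave API.

def playback(events, layer, as_of):
    """Return the events for `layer` with timestamp <= `as_of`, in order."""
    return sorted(
        (e for e in events if e["layer"] == layer and e["timestamp"] <= as_of),
        key=lambda e: e["timestamp"],
    )

# Example: environmental readings logged over time.
events = [
    {"layer": "air-quality", "timestamp": 1, "value": 41},
    {"layer": "air-quality", "timestamp": 2, "value": 58},
    {"layer": "potholes",    "timestamp": 2, "value": 1},
    {"layer": "air-quality", "timestamp": 3, "value": 35},
]
# "Data from a year ago" corresponds to as_of=2 in this toy log.
snapshot = playback(events, "air-quality", as_of=2)
```

Because events are never discarded, the same log supports both live rendering (filter up to now) and historical replay (filter up to any earlier time).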

We will set up a participatory web site to accompany the panel, called “What would you add to the world?” Conference attendees will be asked to augment some real-world photos and other media. Participants will send ideas as PNG images with transparent backgrounds, and the website will let everyone’s layer submissions be toggled on and off. We will also set up “How would you edit layers?” for people to participate collaboratively, live, in Google Wave.

Panel Members:

Jeremy Hight – modulated mapping; locative narratives; channels of augmentation; end-user-adjustable interface controls; communal AR development and social networking tools fused to AR and geo-spatial augmentation; AI as interface; geo-spatial news, user channels, and back-end AI

Joe Lamantia – UX: the experience of creating and interacting with social augmented experiences; concepts and models for understanding and contributing to shared augmented experiences, such as the social scales for interaction

Tish Shute – AR and networked urbanism; connecting people to environments through games and social interaction; making the invisible visible; AR and new public infrastructures; AR and ubicomp; city sensing and citizen sensing

Thomas Wrobel – creating distributed multiuser AR using current infrastructures and protocols (Wave enabled AR)

Photo of Tish Shute

Tish Shute

Ugotrade

Tish Shute is the founder of Ugotrade.

My career in new media and technology began with work in motion control, robotics, and special effects for film, television, theme parks, and aerospace. I continue my interest in innovation and paradigm shifts as an entrepreneur and writer interested in sustainable living, ubiquitous computing, augmented reality, and virtual realities in World 2.0.
I have an M.Phil. in “Culture and Media” from NYU’s Department of Anthropology, where I pursued my interest in the uptake of new technology from a more academic point of view.

Jeremy Hight

Mission College, CA

I have shown work in locative media, new media, sound art, text-and-image art, and text art at galleries and museums internationally.

Author of “Modulated Mapping” (http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf)

An essay on the research and development of intuitive open source mapping and Web 3.0

Author of “Immersive Event Time” (http://piim.newschool.edu/journal/issues/2009/01/)

An essay on new ways to measure events in time, and time itself; it looks at AR, immersive graphic visualizations, and game-interface recreations of historical events with AI

Author of “Immersive Sight in the Third Space” (neme.org/main/645/immersive-sight)
An essay on combining augmented reality, virtual reality, and aspects of locative media in exhibition spaces to create architectural analysis, integrated spatial graphic design, and interactive augmentation

Author of “Narrative Archaeology”

I have published 20+ essays in various fields connected to art and technology, science and art and language, and semiotics and creative writing. I am currently co-editing an issue of the leading journal in art, science, and technology, as well as a book. I curate online exhibitions and advise festivals. I have been a professor (English, multimedia, design theory, art history) for 9 years and love it.

Photo of Joe Lamantia

Joe Lamantia

Oracle Endeca

A veteran architect, consultant, and designer, Joe Lamantia has been an active member and leader in the user experience community since 1996. Joe has crafted successful user experience strategies and created innovative solutions for clients in a wide variety of industries and settings, ranging from Fortune 100 enterprises to local non-profit organizations, digital product companies, and social media. Joe is the creator of the leading freely available tool for card sorting, a frequent writer and speaker on future directions in user experience, and creator of the Building Blocks for Portals design framework.

Joe is currently based in Amsterdam, working as a user experience strategy consultant for a global media agency. He blogs regularly at www.joelamantia.com.

Photo of Sophia Parafina

Sophia Parafina

LDS

Sophia is a freelance geographer. Previously, she was Director of Operations at OpenGeo and an Adjunct Professor at Hunter College. Sophia organized Augmented Reality DevCamp NYC and hosts AR New York Meetups. Before joining OpenGeo, she was a founder and CTO of IONIC Enterprise (acquired by ERDAS) and Senior Program Manager for In-Q-Tel, the CIA’s venture capital fund.

Anselm Hook

Meedan

A developer and designer building collaborative communication tools to help individuals and communities see, communicate, and act. Presently CTO of Meedan. Previously directed development of Platial, among other projects.