Notes from ETIS '16 -- Day 1

Opening


background, opportunity


[Evolution of Tangible Interaction]

*References
  1. Bishop, D. (1992). Marble answering machine. Royal College of Art, Interaction Design.
  2. Wellner, P., Mackay, W., & Gold, R. (1993). Back to the real world. Communications of the ACM, 36(7), 24-26.
  3. Fitzmaurice, G. W. (1996). Graspable user interfaces (Doctoral dissertation, University of Toronto).
  4. Ullmer, B., & Ishii, H. (2000). Emerging frameworks for tangible user interfaces. IBM Systems Journal, 39(3.4), 915-931.
  5. Hornecker, E., & Buur, J. (2006, April). Getting a grip on tangible interaction: a framework on physical space and social interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 437-446). ACM.
  6. First International Conference on Tangible, Embedded and Embodied Interaction. http://tei-conf.org/07/
  7. Ishii, H., & Ullmer, B. (1997, March). Tangible bits: towards seamless interfaces between people, bits and atoms. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (pp. 234-241). ACM.
  8. Vaucelle, C., & Ishii, H. (2008, September). Picture this!: film assembly using toy gestures. In Proceedings of the 10th International Conference on Ubiquitous Computing (pp. 350-359). ACM.
  9. Van Den Hoven, E., & Mazalek, A. (2011). Grasping gestures: Gesturing with physical artifacts. Artificial Intelligence for Engineering Design, Analysis and Manufacturing, 25(03), 255-271.
  10. Ishii, H., Lakatos, D., Bonanni, L., & Labrune, J. B. (2012). Radical atoms: beyond tangible bits, toward transformable materials. Interactions, 19(1), 38-51.
  11. Van den Hoven, E., van de Garde-Perik, E., Offermans, S., van Boerdonk, K., & Lenssen, K. M. H. (2013). Moving tangible interaction systems to the next level. IEEE Computer, 46(8), 70-76.





  • Habituated interaction: understanding habits as developed through mutual interaction, where people and technologies adapt to each other over time.
  • Inherently meaningful, interusability: understand tangible interaction as an approach to designing interactive and smart objects which respect and exploit the user’s bodily skills and which build upon the notion that our traditional phy…


Tesler’s Law — Larry Tesler, former VP of Apple

  • You cannot reduce the complexity of a given task beyond a certain point. Once you’ve reached that point, you can only shift the burden around.








IoT school — gathering weather info from a school in Africa, to grow plants in a school in the UK

Ugle — wooden owl that can be controlled over the internet 
(Many other interesting projects on the website)

The Internet of Useless Things — funny examples of IoT devices

Intel, Sarah Gallacher - Internet of Tangible Things, VoxBox
Modular design, group activity

People assume: card = data holder; not connected to the internet because it doesn’t look like a computer.

VoxBox + talk box

Physikit

Smart Citizen — urban sensing, crowdfunded.

Physikit + Smart Citizen — got people engaged in using Smart Citizen; before this, Smart Citizen had a high drop-out rate. Published at CHI ’16.




[Pointing tasks]

  • direct pointing, indirect pointing
  • absolute pointing, relative pointing
  • position control, rate control
  • isotonic device (without resistance) vs. isometric device (with resistance; the device doesn't move but senses the force applied to it) (see the sketch after this list)
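
A minimal sketch of the position-control vs. rate-control distinction above, with made-up function names, gains and values (not from the talk): an isotonic device's displacement maps directly onto cursor displacement, while an isometric device's sensed force maps onto cursor velocity, so the cursor keeps drifting as long as force is applied.

```python
# Position control (isotonic device, e.g. a mouse) vs. rate control
# (isometric device, e.g. a force-sensing stick). Gains are illustrative only.

def position_control(cursor_xy, device_dxdy, gain=2.0):
    """Device displacement maps directly to cursor displacement."""
    x, y = cursor_xy
    dx, dy = device_dxdy
    return (x + gain * dx, y + gain * dy)

def rate_control(cursor_xy, force_xy, dt, gain=50.0):
    """Sensed force maps to cursor velocity; the device itself barely moves."""
    x, y = cursor_xy
    fx, fy = force_xy
    return (x + gain * fx * dt, y + gain * fy * dt)

# Same input magnitude, very different behaviour over time:
cursor = position_control((0.0, 0.0), (1.0, 0.0))        # jumps once, then stops
for _ in range(10):                                       # drifts while force is held
    cursor = rate_control(cursor, (1.0, 0.0), dt=0.016)
```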

Introduction of structured light scanning (early 2000s)
  • works well in the lab, but not outside (because of sunlight, etc.)
  • then photo scanning (photographic acquisition): fragment problem: how to put the fragments together? [Reuter, Rivière, Couture, Mahut, Espinasse, ACM JOCCH 2010] -> by manual interaction. A clutch keeps the objects' position (Buxton's state model); see the sketch after this list.
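
A rough state-machine sketch of the clutch, following Buxton's three-state model (state 0: out of range, state 1: tracking, state 2: engaged); the event names and pose format are assumptions for illustration, not the authors' implementation. The point is that a fragment's pose only changes in state 2, so it keeps its position when the clutch is released.

```python
# Hypothetical clutch for manual reassembly, after Buxton's three-state model.
OUT_OF_RANGE, TRACKING, ENGAGED = 0, 1, 2

class Clutch:
    def __init__(self):
        self.state = OUT_OF_RANGE
        self.fragment_pose = (0.0, 0.0)      # pose of the virtual fragment

    def on_event(self, event, hand_pose=None):
        if event == "enter_range":
            self.state = TRACKING
        elif event == "leave_range":
            self.state = OUT_OF_RANGE
        elif event == "engage" and self.state == TRACKING:
            self.state = ENGAGED
        elif event == "release" and self.state == ENGAGED:
            self.state = TRACKING            # pose is frozen from here on
        elif event == "move" and self.state == ENGAGED:
            self.fragment_pose = hand_pose   # only state 2 moves the fragment

clutch = Clutch()
clutch.on_event("enter_range")
clutch.on_event("engage")
clutch.on_event("move", hand_pose=(0.3, 0.8))
clutch.on_event("release")
clutch.on_event("move", hand_pose=(9.9, 9.9))   # ignored: clutch is released
```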

Noise in 3D information (from scanning)
  • how to distinguish large from small noise, in order to interpolate?
  • both sizes of coverage would be informative; use a method with a circle of dynamic size (see the sketch after this list)
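
One possible reading of the dynamic-circle idea, sketched with assumed radii and threshold (not values from the talk): compare each scan point with its neighbours inside a small and a large circle, and use both deviations to decide whether to drop it as large noise or smooth it away as small noise.

```python
import numpy as np

def classify_noise(points, r_small=0.01, r_large=0.05, thresh=0.02):
    """points: (N, 3) array of scanned positions. Returns one label per point."""
    labels = []
    for p in points:
        d = np.linalg.norm(points - p, axis=1)
        near = points[(d > 0) & (d < r_small)]      # small circle around p
        far = points[(d > 0) & (d < r_large)]       # large circle around p
        if len(far) == 0:
            labels.append("outlier")                # isolated point: large noise
            continue
        dev_small = np.linalg.norm(p - near.mean(axis=0)) if len(near) else np.inf
        dev_large = np.linalg.norm(p - far.mean(axis=0))
        if dev_large > thresh:
            labels.append("outlier")                # off even at the large scale
        elif dev_small > thresh:
            labels.append("smooth")                 # small noise: interpolate locally
        else:
            labels.append("keep")
    return labels
```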

3D geometry inspection with AR
  • the finger is tracked (Leap Motion); a virtual torch reveals additional information (a processed image to see the statue better) [Ridel, Reuter, Rivière, Laviole, Mellado, Couture, Granier, ACM JOCCH 2014] (see the sketch after this list)
  • direct pointing
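
A minimal sketch of this kind of direct pointing (the planar-surface assumption, the parameter values and the function names are mine, not the paper's): the tracked fingertip defines a ray, and vertices near the ray's hit point on the surface get the processed/enhanced rendering, like a torch spot.

```python
import numpy as np

def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Where the finger ray meets an (assumed planar) artefact surface."""
    direction = direction / np.linalg.norm(direction)
    t = np.dot(plane_point - origin, plane_normal) / np.dot(direction, plane_normal)
    return origin + t * direction

def torch_mask(vertices, hit_point, radius=0.05):
    """Vertices inside the torch spot, to be drawn with the enhanced view."""
    return np.linalg.norm(vertices - hit_point, axis=1) < radius

# Fingertip position/direction would come from the tracker; values here are made up.
hit = ray_plane_hit(np.array([0.0, 0.2, 0.3]), np.array([0.0, -1.0, -1.0]),
                    np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))
vertices = np.random.rand(1000, 3)          # stand-in for the statue's mesh vertices
lit = torch_mask(vertices, hit)
```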

The future
  • using new devices to bring this setting into a mobile context
  • example: finding a vein for an injection
  • projection tech is maturing (no lamp needed; red/green/blue lasers synthesise colour; distant and close objects can be in focus at the same time) (the projector is as small as a coin) (Lenovo has a laser projection system in a mobile phone, not commercialised yet)
  • example: HideOut project by Disney: mobile projector interaction with tangible …
  • computer vision: normal lens + micro-lens array; can generate different focus from a static image (a 4D image); they want to use this tech at the microscopic level (see the sketch after this list)
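
A toy sketch of why a micro-lens ("4D") light field allows refocusing after capture, using the classic shift-and-add scheme; the array layout and the alpha parameter are assumptions for illustration, not the system mentioned in the talk.

```python
import numpy as np

def refocus(lightfield, alpha):
    """lightfield: (U, V, H, W) array of sub-aperture images.
    alpha sets the synthetic focal plane; alpha = 0 keeps the captured focus."""
    U, V, H, W = lightfield.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            du = int(round(alpha * (u - U // 2)))    # shift proportional to the
            dv = int(round(alpha * (v - V // 2)))    # view's angular offset
            out += np.roll(lightfield[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)                             # average the shifted views

# Example with random data just to show the shapes involved:
lf = np.random.rand(5, 5, 64, 64)
sharp_near = refocus(lf, alpha=1.5)
```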
Tech review

Approach

  • the Smileyometer (interesting and makes sense because the target users are kids)
  • user-centered design
  • Conci, Andrea, Cristina Core, and Fabio Morreale. "Weighting Play and Learning in Interaction." (2015). (http://palx.inf.unibz.it/papers/Conci.pdf)



Interaction: creation, mediation, …
[Victor 2014] The Humane Representation of Thought
[Raskar 2001] Interacting with Spatial Augmented Reality
"Computing is not about computers any more. It is about living." - Nicholas Negroponte

Clock
TEI ’16 — Tangible Viewports: getting out of flatland in desktop environments


Physiological computing
  • exposing brain activity and internal (lung, heart) processes -> related to the quantified self
  • Inner Garden. TEI ’16 WIP. 



Tangible Game


Storyboard for new tangible quantified-self applications


Subjective information on a map of Rennes

Interactive maps
  • tangible objects on map (2D/3D)
  • Ebert, Weber, Cemea, Putsch (map filter)
  • simulation http://cp.media.mit.edu/… locating houses using LEGO blocks
  • Couture, Rivière, Reuter, GeoTUI
  • Ma, Sindorf, Liao, Frazier, Using… (interaction in a museum)
  • Sandbox

Current work
  • interaction: zooming (slider), panning, different views (3-state slider) (see the sketch after this list)
  • Exploring input modalities for interacting with augmented paper maps. Chatain, Demangeat,
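
A minimal sketch of the interaction mapping described above (view names, zoom range and the linear mapping are all assumptions): a continuous slider drives the zoom level and a 3-state slider switches between views of the augmented paper map.

```python
VIEWS = ["base map", "points of interest", "subjective data"]   # hypothetical views

def zoom_from_slider(slider_value, min_zoom=1, max_zoom=18):
    """slider_value in [0, 1] mapped linearly to a discrete map zoom level."""
    return round(min_zoom + slider_value * (max_zoom - min_zoom))

def view_from_three_state_slider(state):
    """state is 0, 1 or 2: each detent selects one view of the augmented map."""
    return VIEWS[state]

print(zoom_from_slider(0.5), view_from_three_state_slider(2))
```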

Future work
  • users no longer consider the interface paper to be just paper; e.g., they were afraid of touching it in case they broke the system, and they thought the system was malfunctioning because the paper moved while they were drawing

From iSolutions Health

Prior work
  • Bachelor thesis by Tommy vin Lahm: doctors want to see the place where the marker is.
Current work
  • "Magic Lens”, an acrylic glass ring
  • enlargement within the lens or side-by-side (see the sketch after this list)
  • results were similar; no significant difference
  • Feedback: good. "We can stand up again and discuss, like using films on a wall. Much better for the back."
  • Why do we need this physical thing? Why not a virtual lens? -> physical representation.
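
A rough sketch of the two magnification modes compared here (coordinates, zoom model and names are assumed for illustration): the enlarged content is drawn either inside the lens outline or in a fixed side-by-side panel next to it.

```python
def magnified_region(lens_center, lens_radius, zoom):
    """Source square on the wall display covered by the acrylic ring."""
    cx, cy = lens_center
    half = lens_radius / zoom          # smaller source area -> stronger magnification
    return (cx - half, cy - half, cx + half, cy + half)

def target_region(lens_center, lens_radius, mode, panel_rect=None):
    """Where the enlarged content is drawn: inside the lens, or in a side panel."""
    if mode == "in_place":
        cx, cy = lens_center
        return (cx - lens_radius, cy - lens_radius, cx + lens_radius, cy + lens_radius)
    return panel_rect                   # "side_by_side": a fixed panel next to the lens

src = magnified_region((400, 300), 80, zoom=2.0)
dst = target_region((400, 300), 80, mode="side_by_side", panel_rect=(800, 0, 1100, 300))
```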
? Have you published?
? Have you thought of / was there a need for multiple lenses?
? How is the lens recognised (the marker)?
? Why didn’t you use direct control of the lens?



Gestures: 6 types: rotating in 2 directions, moving in a circle in 2 directions, moving straight to the left/right (see the sketch below)
Adjusting the shape of the tokens (notches) -> leads users to grab the token properly and makes it easier for the system to recognise -> great improvement!
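
A simplified sketch of how the six gestures might be classified from a token's (x, y, angle) trajectory; the thresholds, the shoelace-area test and the whole classification logic are assumptions, not the authors' recogniser.

```python
import math

def signed_area(samples):
    """Shoelace area of the path; its sign gives the circling direction
    (which sign means 'clockwise' depends on the surface's y axis)."""
    return 0.5 * sum(a[0] * b[1] - b[0] * a[1]
                     for a, b in zip(samples, samples[1:]))

def classify_gesture(samples, move_thresh=20.0, rot_thresh=0.5):
    """samples: list of (x, y, angle_rad) for one recognised token."""
    x0, y0, a0 = samples[0]
    x1, y1, a1 = samples[-1]
    net_move = math.hypot(x1 - x0, y1 - y0)
    path_len = sum(math.hypot(b[0] - a[0], b[1] - a[1])
                   for a, b in zip(samples, samples[1:]))
    d_angle = a1 - a0
    if net_move < move_thresh and abs(d_angle) > rot_thresh:
        return "rotate_ccw" if d_angle > 0 else "rotate_cw"       # rotation in place
    if path_len > 2 * net_move and path_len > move_thresh:
        return "circle_ccw" if signed_area(samples) > 0 else "circle_cw"
    return "move_right" if x1 > x0 else "move_left"               # straight movement
```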

Applications: map (different layers of info), game, access control

Future work
  • releasing one finger: at the moment, three fingers must be used to place a token so that it is recognised; after that, users can use only two fingers
  • increase the gesture vocabulary: e.g., on-surface / off-surface gestures