This performance is part of a practice-based PhD that employs embodied and choreographic techniques to inform the development of methods and tools for designing web interfaces driven by transparency and legibility. It draws on a body of work [1] that explores how choreography provides alternative insights into how algorithms shape web environments.
In a similar vein, various embodied tactics have emerged for making sense of the complexities of algorithmic systems, including Experiential AI [2], which focuses on felt experience, and Graspable AI [3], which uses physical artefacts as a relational way of interpreting algorithms.
Joana Chicau will present early prototypes, such as custom interfaces and plug-ins, that aim to respond to the opaque algorithmic models [4] embedded in pervasive web interfaces, for instance by exposing user tracking and phenomena such as ‘dark patterns’ [5] through live performance.
She will interweave live coding, using JavaScript commands in the web console and custom browser tools, with voice narration that follows a choreographic perspective.
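As a hint of what this kind of console work can look like, the following is a minimal, hypothetical sketch (not code from the performance itself) of JavaScript that can be pasted into a browser's web console to list the third-party hosts injecting scripts into the current page, one small way of making embedded tracking legible:

```javascript
// Minimal sketch, assuming it is run in the web console of a page that
// loads external scripts; illustrative only, not the performance code.
const firstParty = location.hostname;

// Collect the hostnames of all externally loaded scripts on the page.
const thirdPartyHosts = [...document.querySelectorAll('script[src]')]
  .map(script => new URL(script.src, location.href).hostname)
  .filter(host => host !== firstParty);

// Count how many scripts each external host injects into the page.
const counts = thirdPartyHosts.reduce((tally, host) => {
  tally[host] = (tally[host] || 0) + 1;
  return tally;
}, {});

// Render the result as a table directly in the console.
console.table(counts);
```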
[1] J. Chicau, “Choreo-Graphic-Thinking.” Available: https://joanachicau.com/backstage.html. [Accessed: 11-Dec-2022].
[2] D. Hemment et al., “Experiential AI,” AI Matters, vol. 5, no. 1, pp. 25–31, Apr. 2019, doi: 10.1145/3320254.3320264.
[3] M. Ghajargar, J. Bardzell, A. M. Smith-Renner, K. Höök, and P. G. Krogh, “Graspable AI: Physical Forms as Explanation Modality for Explainable AI,” in Sixteenth International Conference on Tangible, Embedded, and Embodied Interaction, Daejeon, Republic of Korea, Feb. 2022, pp. 1–4, doi: 10.1145/3490149.3503666.
[4] S. B. Pold, “New ways of hiding: towards metainterface realism,” Artnodes, no. 24, pp. 72–82, 2019, doi: 10.7238/a.v0i24.3283.
[5] Y. Rogers, P. Dourish, P. Olivier, M. Brereton, and J. Forlizzi, “The Dark Side of Interaction Design,” in Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, Apr. 2020, pp. 1–4, doi: 10.1145/3334480.3381070.