
H(n)MI

Faculty: Citlali Hernández, Lina Bautista


DAY I

The human body is more than a passive vessel; it is an active participant in communication, perception, and interaction. What is a body? In Human-Nonhuman Machine Interaction (H(n)MI), the body is not merely a subject but a medium for engagement with technology.

A medium or tool for expression

body {
      switch (experience) {
            case 1:
                  individual();
                  break;
            case 2:
                  politic();
                  break;
            case 3:
                  social();
                  break;
      }
}

Our physical form, movements, and even subtle gestures can shape digital experiences, making interaction more intuitive and immersive.

To explore this, we assembled DIY soft pressure sensors: flexible pads that respond to applied pressure.

By layering Velostat (a pressure-sensitive material) with conductive fabric, we built sensors capable of transforming physical interaction into measurable electrical signals.

These sensors were then connected to an Arduino UNO, and through Processing we visualized the signals, mapping body movements into digital responses in real time.

This exercise reinforced the idea that interaction doesn't have to be limited to screens, buttons, or keyboards; instead, the body itself can be an expressive interface, enabling richer, more embodied ways of interacting with technology.

// CODE TO READ TOUCH SENSOR
const int sensorPin = A0; // select the input pin for the sensor
int sensorValue = 0;      // variable to store the value coming from the sensor

void setup() {
  Serial.begin(9600); // initialize serial communication at 9600 bits per second
}

void loop() {
  sensorValue = analogRead(sensorPin); // read the value from the sensor
  Serial.println(sensorValue);         // print the sensor value to the serial monitor
  delay(100);                          // wait for 100 milliseconds
}
// CONNECTING ARDUINO TO PROCESSING
import processing.serial.*;

Serial mySerial;   // serial port connected to the Arduino
String myString;   // last line received from the Arduino
int nl = 10;       // ASCII code for newline, used as the line delimiter
float myVal;       // sensor value parsed from the incoming line

void setup() {
  size(800, 600);
  printArray(Serial.list());            // list the available serial ports
  delay(5000);
  String myPort = Serial.list()[8];     // adjust this index to match your Arduino's port
  mySerial = new Serial(this, myPort, 9600);
  smooth();
}

void draw() {
  while (mySerial.available() > 0) {
    myString = mySerial.readStringUntil(nl);   // read one complete line
    background(255, 0, 255);
    if (myString != null) {
      myVal = float(myString.trim());          // parse the reading (0-1023)
      println(myVal);
      circle(width/2, height/2, myVal);        // circle diameter follows the pressure
    }
  }
}

DAY II

DAY III

In the third session, we explored serial communication between Arduino and p5.js using the Web Serial library. 👇

We controlled the size and color of a circle based on data from the pressure sensor and used millis() to create time-based animations. This combination of real-time data, dynamic visuals, and user interaction showed how technology can interactively represent bodily states.
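
The full sketch lives only in the repository linked below, so here is a minimal sketch of what that interaction could look like. It assumes the createSerial() / opened() / readUntil() functions of the p5.webserial library linked at the bottom of this page, and an Arduino printing one 0-1023 reading per line as in the DAY I code; the specific mappings and the pulse speed are illustrative choices, not the team's actual values.

// P5.JS SKETCH (DAY III): CIRCLE SIZE AND COLOR FROM THE PRESSURE SENSOR
let port;        // serial connection to the Arduino (p5.webserial)
let value = 0;   // latest sensor reading, assumed 0-1023

function setup() {
  createCanvas(800, 600);
  port = createSerial();
}

function mousePressed() {
  // the Web Serial API only grants access to a device after a user gesture
  if (!port.opened()) {
    port.open(9600);
  }
}

function draw() {
  // read one complete line sent by the Arduino (see the DAY I sketch)
  let line = port.readUntil("\n");
  if (line.length > 0) {
    value = float(line);
  }

  background(20);

  // map the pressure reading to diameter and color
  let diameter = map(value, 0, 1023, 20, 400);
  let redness = map(value, 0, 1023, 0, 255);

  // millis()-driven pulse layered on top of the sensor-driven size
  let pulse = 20 * sin(millis() / 300.0);

  noStroke();
  fill(redness, 80, 255 - redness);
  circle(width / 2, height / 2, diameter + pulse);
}

Opening the port inside mousePressed() is not a stylistic choice: the browser's Web Serial API requires a user gesture before a serial device can be opened.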

Check the repository here 👇


Sensing and Sensation Symposium with TRX

The second session of the workshop introduced sound as another dimension of interaction. Sound is an incredibly powerful medium: we worked with microphones and sensors, inputs that were processed with Arduino and then translated into interactive elements within p5.js.
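
The symposium setup itself is not documented in this post, but as a rough, assumed illustration, a sound-level value arriving over the same serial connection could be smoothed and mapped to a visual in p5.js. The 0-1023 range, the lerp() smoothing factor, and the ring mapping below are all assumptions rather than the actual symposium code.

// P5.JS SKETCH (SYMPOSIUM): SOUND LEVEL SMOOTHED INTO A PULSING RING
let port;        // serial connection to the Arduino (p5.webserial)
let raw = 0;     // latest sound-level value sent by the Arduino, assumed 0-1023
let level = 0;   // low-pass filtered level used for drawing

function setup() {
  createCanvas(800, 600);
  port = createSerial();
}

function mousePressed() {
  if (!port.opened()) {
    port.open(9600);
  }
}

function draw() {
  let line = port.readUntil("\n");
  if (line.length > 0) {
    raw = float(line);
  }

  // low-pass filter so the visual breathes instead of flickering
  level = lerp(level, raw, 0.1);

  // louder input = brighter background and wider ring
  background(map(level, 0, 1023, 0, 120));
  noFill();
  stroke(255);
  strokeWeight(4);
  circle(width / 2, height / 2, map(level, 0, 1023, 50, height));
}
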
GitHub - gohai/p5.webserial: A library for p5.js which adds support for interacting with Serial devices, using the Web Serial API (currently supported on Chrome and Edge).
GitHub - Flax96-BIT/H-n-MI-Team_02: Repository of Carlos, Flavio and Mohit about the H(n)MI workshop