Final Midterm Project Documentation
Aspirational project description
A living room table centerpiece that has all the tools to spy on my roommates, yet chooses not to. Instead of using those tools for surveillance, it preserves privacy, honesty and trust around home sharing, and creates vivid memory representations.
Reality-based project description
A centerpiece measuring a living room's ecology. An exploration of what it means to obtain, through DIY means, the kind of data that large companies spend millions of dollars stockpiling. It combines the number of devices connected to WiFi with a magnetometer and a presence sensor to paint a loose picture of the technological and human forces surrounding the house at any given moment.
Angles of this project (written on Week 6)
- A possible pitch could be trying to create Find My for your roommates and communities, but without Big Tech. Build / break up with Find My or other tools meant to help document community.
- Having all the tools for surveillance at the hardware level but not using them at the software level. Whoever you live with can totally spy on you. And if families knew how, they would.
- The idea that a space can call you towards it. If something is happening in the home, it’s drawing you towards the space. A beautiful wearable representation of this would be a Woven Light keychain?
Dashboard
Data Conclusions / Updates since last session
- The x, y, z axes of the magnetometer don't hold much meaning individually. The magnitude √(x² + y² + z²) combines them in an interesting way.
- However, there's still very much an issue: the same microwave has registered at both 180 µT and 90 µT.
- The PIR digital value, currently shown as "Motion detected", is pretty much useless.
- With one PIR, me walking around while my roommates sleep is indistinguishable from all three of us being home.
- WiFi device count also seems spotty. It bears little relationship to when flatmates get home, purely because we switch between devices regularly. The TV is also included in this, and the 3D printer as well. To do this accurately, I'd need to remove known devices. Maybe it's okay to check for a specific person's devices, but only to maintain a counter of who's home.
- To be clear, I already have access to this information through Apple's Find My. The question is: what approach creates lovely paintings and art visualizations without giving external actors a way to map household movements for nefarious reasons?
- Graph on dashboard now allows me to overlay the data.
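Filtering out known devices and keeping a per-person counter, as described above, could look like this minimal JavaScript sketch. The MAC addresses and the person mapping are made-up placeholders, not real values from my network:

```javascript
// Hypothetical device lists: IGNORED_DEVICES are known non-person devices
// (TV, 3D printer); PERSON_DEVICES maps a person's known MACs to a name.
const IGNORED_DEVICES = new Set(["aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02"]);
const PERSON_DEVICES = {
  "11:22:33:00:00:01": "ines",
  "11:22:33:00:00:02": "flatmate_a",
};

// Given the MACs seen on the network, return who's home (people, not devices).
function whoIsHome(activeMacs) {
  const people = new Set();
  for (const mac of activeMacs) {
    if (IGNORED_DEVICES.has(mac)) continue; // skip TV, printer, etc.
    const person = PERSON_DEVICES[mac];
    if (person) people.add(person); // count each person once, however many devices
  }
  return [...people];
}
```

This keeps only a counter of who's present, never a movement log, which fits the privacy framing above.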
Rough state buckets for House Energy: Connected, Isolated and Empty
Each 1-hour time bucket is classified into one of three states using two signals from the sensor data: PIR Raw standard deviation (σ) measures activity in the sensor room, and WiFi device count average measures how many people are likely home.
Thresholds (could be tuned further!):
- PIR_ACTIVE_THRESHOLD = 40: above this means active presence in the room.
- PIR_QUIET_THRESHOLD = 15: below this means nobody moving at all.
- WIFI_HOME_THRESHOLD = 14: above this means people are home. Didn't manage to get a clear idea of how many.
| State | Condition | Meaning |
| --- | --- | --- |
| Connected | PIR Raw σ > 40 | Active presence in the sensor room, regardless of WiFi count |
| Isolated | PIR Raw σ ≤ 40 AND (WiFi ≥ 14 devices OR no WiFi data) | People are home but not in this room |
| Empty | PIR Raw σ ≤ 40 AND WiFi < 14 devices | Quiet room, few devices on network |
| Empty (also) | PIR Raw σ ≤ 15 AND WiFi < 14 devices | No movement at all, few devices |
Edge case: when PIR σ is between 15–40 and there's no WiFi data available (wifiAvg < 0), we default to Isolated, assuming people might be home, since WiFi alone can't tell us. Our WiFi goes down constantly.
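A minimal sketch of this classifier in JavaScript, assuming each 1-hour bucket arrives as an object with `pirStd` and `wifiAvg` fields (with `wifiAvg < 0` meaning "no WiFi data", as above); this is an illustration of the rules, not the dashboard's actual code:

```javascript
const PIR_ACTIVE_THRESHOLD = 40;
const PIR_QUIET_THRESHOLD = 15; // informational: both "Empty" rows collapse to one state
const WIFI_HOME_THRESHOLD = 14;

// Classify one hourly bucket into Connected / Isolated / Empty.
function classifyBucket({ pirStd, wifiAvg }) {
  if (pirStd > PIR_ACTIVE_THRESHOLD) return "Connected"; // movement in the room wins
  if (wifiAvg < 0) return "Isolated"; // WiFi data missing: assume people may be home
  if (wifiAvg >= WIFI_HOME_THRESHOLD) return "Isolated"; // devices home, room quiet
  return "Empty"; // quiet room, few devices on the network
}
```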
Note: The Z-axis magnetometer standard deviation is computed and displayed in the energy strip tooltip, but it does not drive the classification. It doesn't vary meaningfully with human activity based on 16 days of observed data.
Data Processing Pipeline (Chart)
The chart runs this pipeline on every 5-second fetch cycle:
- Ingest: parse line-delimited JSON from log.json, processing only new lines since the last fetch
- Filter: keep only points within the selected time window (1h / 6h / 24h / 7d / 30d)
- Downsample: min/max bucketing reduces up to 520,000 points → ~1,200 drawn points, preserving spikes and dips
- Draw: Each checked series is drawn independently on a shared canvas with dual Y-axes (left = µT, right = PIR/WiFi).
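The downsample step above can be sketched like this: split the series into N buckets and keep each bucket's minimum and maximum point so spikes and dips survive. The `{ t, v }` point shape is an assumption for illustration, not the dashboard's actual schema:

```javascript
// Min/max bucketing: reduce a long series to at most 2 points per bucket
// while preserving extremes (spikes and dips stay visible when drawn).
function downsampleMinMax(points, targetBuckets) {
  if (points.length <= targetBuckets * 2) return points; // already small enough
  const bucketSize = Math.ceil(points.length / targetBuckets);
  const out = [];
  for (let i = 0; i < points.length; i += bucketSize) {
    const bucket = points.slice(i, i + bucketSize);
    let min = bucket[0], max = bucket[0];
    for (const p of bucket) {
      if (p.v < min.v) min = p;
      if (p.v > max.v) max = p;
    }
    // Emit in time order so the drawn line doesn't zigzag backwards.
    if (min === max) out.push(min);
    else out.push(...(min.t <= max.t ? [min, max] : [max, min]));
  }
  return out;
}
```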
Next steps
I realized the only way this really works is by using a microphone and doing FFT analysis.
The PIR setup would be optimized by having several. I got small ones with a 3 m range that I liked, but never got around to redesigning the project. Three facing different directions would be ideal.
System Diagram
Wiring Diagram
Enclosure Experiments
I experimented with different designs. Ultimately, I want the final one to be similar to the above.
Before designing the final version, I need to protoboard the circuit so that I can be working with more flexible dimensions.
Current enclosure
Source Code:
Credits and development notes
A number of early trials to check wiring and base function were done through library examples and Tom's web page (TFT screen; Bosch's and Waveshare's documentation for the magnetometer; the PIR too), then combined at different points using Claude Code, which also created documentation for the README file. Prototypes that started with AI escalated into fuller features. There are some regrettable choices; however, it works and it's understandable to me, so it's unclear how to feel about it.
Will compile a list of references ASAP.
This blog is, however, 100% handwritten.
Resource List
Code examples for initial build here on my GitHub:
art via emotion through data
Inspiration
I aim to create visuals along these lines in the next phase of this project. My blocker right now is having concluded that my sensors aren't massively reliable if I actually want this to carry meaning; I also haven't looked much into which libraries I could use without getting too deep into shaders.
The end goal is to scroll through a timeline and, in heatmap fashion, see date/time and the energy of the home.
Week 7: Data Analysis
Data Unknowns
There are several options…
- Manual annotation - either via SSH scripts on my phone OR through web interface
- Averaging, smoothing, or rolling windows
- Measuring anomalies or thresholds
- Statistical aggregation (mean, median, standard deviation)
- No cross-correlation between series
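For the rolling-window and standard-deviation options above, here is a plain JavaScript sketch of a rolling standard deviation, the same statistic used for the PIR Raw σ signal (a simple two-pass formula, not optimized):

```javascript
// Population standard deviation of an array of numbers.
function stdDev(values) {
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const variance = values.reduce((a, v) => a + (v - mean) ** 2, 0) / values.length;
  return Math.sqrt(variance);
}

// Rolling standard deviation: one value per full window of samples.
function rollingStdDev(samples, windowSize) {
  const out = [];
  for (let i = windowSize; i <= samples.length; i++) {
    out.push(stdDev(samples.slice(i - windowSize, i)));
  }
  return out;
}
```

A flat signal (nobody moving) rolls out as all zeros; bursts of PIR activity show up as windows with high σ.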
Done
✅ Fixed wiring issue from last week;
Now it’s standalone!
✅ Script to try connecting to Home and Sandbox WiFi Networks;
✅ ONLY if the device is at home, check how many devices are active; started logging which WiFi it's connected to;
✅ Adding this to dashboard;
🤔 Did not implement, but researched/soldered mic; Thermal camera and different PIRs’ range
Important to do
Next steps depending on how much energy I can budget for this
- Redesign the enclosure to hold several PIRs and work as a centerpiece
- Intersect data in a more interesting way
- ✨ Artistic output ✨
- Light center piece / wall mounted matrix;
little joys (SSH’ing into my server from my phone)
Adding active device count to dataset
Scanning devices within Networks
Questions
- Excluding a scenario where someone is physically near my home network, what are the issues of sharing my WiFi network name and password with an LLM or online?
- Or even the information of how many and which devices are connected to WiFi?
sysctl net.link.ether.inet.max_age
# Check ARP cache entry lifetime in seconds.
# Mine is 1200 seconds, or 20 minutes.
# Since pinging every 20 minutes is perfect for the use case, we don't need real time.
Added ESPing library.
Project Documentation should include,
- µT (microtesla);
- re-do dashboard to be external facing;
- do annotations;
- commenting my code to learn it;
Ask Yuliya for any data analysis;
Data Visualization
- Canvas 2D API (ctx.beginPath(), ctx.lineTo(), ctx.stroke()) but could use Chart.js
- d3.js
- p5.js: x, y of the pixel; learning how to do that;
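A minimal sketch of the raw Canvas 2D approach from the first bullet, split so the data-to-pixel mapping is a pure, testable function. The `{ t, v }` point shape and `bounds` object are assumptions for illustration:

```javascript
// Map a data point to canvas pixel coordinates within the given bounds.
function toCanvasXY(point, bounds, width, height) {
  const x = ((point.t - bounds.tMin) / (bounds.tMax - bounds.tMin)) * width;
  // Canvas y grows downward, so flip the value axis.
  const y = height - ((point.v - bounds.vMin) / (bounds.vMax - bounds.vMin)) * height;
  return { x, y };
}

// Draw one series as a polyline using the Canvas 2D calls mentioned above.
function drawSeries(ctx, points, bounds, width, height) {
  ctx.beginPath();
  points.forEach((p, i) => {
    const { x, y } = toCanvasXY(p, bounds, width, height);
    if (i === 0) ctx.moveTo(x, y);
    else ctx.lineTo(x, y);
  });
  ctx.stroke();
}
```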
Week 6 - Device 1 - Environmental Data with MQTT
http://living.ines.systems/
[Add my experiments around the ESP32 for Body Pose; and with Seeed Studio Display]
MQTT
Instead of sending a message every ~5 ms, I could use the ArduinoMqttClientTouchReadESP32.ino example to send data only when a certain threshold is crossed. Is there any downside vs. sending messages based on time?
Running on VPS
Wifi/DensePose experiment
I found a project that uses an ESP32-S3 (or ideally more) and a WiFi router to determine body pose, breathing, heartbeat, motion and presence within 5 m of the router.
I was able to set up the Docker Container but the ESP32-S3 couldn’t connect to my home router, due to it being WPA3. I wasn’t able to test with a personal hotspot because I changed hotspot names, but that might have worked.
Question: Does switching my home router to WPA2/WPA3 mixed mode have any unintended consequences?
Links
“The ESP32 nodes stream binary CSI frames over UDP to port 5005. See Hardware Setup for flashing instructions.”
Next steps and existential programmatic questions and interesting finds and thought ships (trains are too slow for this analogy ha)
- ArduinoMqttClientWithWill introduces MQTT Last Will and Testament (LWT). It lets the broker automatically publish a "device offline" message if the sensor drops unexpectedly, which is useful for a long-running room monitor meant to be stationary and unmonitored.
- ✅ How to keep real time without an extra physical RTC module with a battery? NTP (Network Time Protocol) works: the ESP32 fetches real clock time from the internet right after WiFi connects. Built into the ESP32, no extra library needed.
- Currently editing in VS Code in a local / GitHub folder and then copying into the VPS. I wonder if there's a better way? Feels silly and prone to confusion. OOOOR it could be automated somehow: by setting up Git between my personal computer and my VPS? Or by making a little alias to automatically copy/replace files once edits are done.
- Could I include compiling on the Arduino IDE as part of some end-of-bug-fixing script? Or only for CircuitPython compiling?
- Add script to check different WiFi networks for connectivity
My computer:
scp script.js ines@living.ines.systems:/tmp/
VPS:
ssh ines@living.ines.systems
sudo cp /tmp/script.js /var/www/html/
Find it fascinating that I don’t need to restart any PID on my VPS after changing the files… Is it because it’s JS specifically? or a browser client? Can I set any server up to update automatically after copying a file into it?
I remember learning programming (Ruby on Rails) that server-side data would be served once and couldn't be changed through user action. But I suppose that's not true in applications: one could have the process just running in the background, the VPS never shuts down, and while the same data wouldn't change, we could send requests to swap out just portions of it. I suppose that's what React is. It was exciting because otherwise web pages were served/updated in full. But it's not that such an architecture is impossible to develop.
This modular thinking would serve AI contexts well in interfaces. RE Anthropic Hackathon
Is there a test mode or do I always just ship to “production”?
Serving to localhost wouldn’t be local to me — not same network. So… I would be deploying to a subdomain or another test website setup done for this purpose. But because I have no users or important DB/Data, then I don’t have to care…
In class demo issue: SparkFun’s GPIO12 was wrongfully connected. It's a strapping pin that controls flash voltage. If it's HIGH at boot, flash runs at 1.8V instead of 3.3V. The flash chip needed 3.3V to write properly.
Solid enclosures and circuits; next steps
- Protoboard: Breadboard wire is enemy #1 of small devices. Purchased very very tiny terminal blocks.
- Add USB port opening.
- Test how exposed the magnetometer needs to be to stay functional.
- EG. very thin layer of PLA, very little in-fill in that specific section? Do I need to edit this in G-Code or does my slicer have options?
- Learn CAD methods for twist locks and clasps. Enclosures need to be open and repairable. Screws are sometimes conceptually... screwed to remove.
- GrabCAD for 3D models of the electronics, then extrude those out of the enclosure instead of modelling from scratch.
- Add LIPO breakout board
- Can they take any voltage power source?
Interface
- Display: What’s useful to know when debugging?
- PWM an LED to show you if it’s connected.
- Aram has very small LEDs.
- Use color associations and pattern recognition wisely.
- Work towards reducing number of buttons.
- Audio is useful for events (EG. temperature reaching X; print is ready).
Data / sensor selection next steps
I’m still fundamentally missing data outcomes that I find exciting.
Options that require purchasing/sourcing
- Get 3 extra PIRs + XIAO ESP32 and place them on individual room doors.
- Array of PIRs but not in the individual room doors, instead facing different directions on the table center piece or in key spots of the apartment.
- Get a magnetometer with more range.
- Get a microphone that only communicates to MQTT whether there’s sound and at what intensity. No sound files transmitted for privacy. Sound is a huge hallmark of motion and aliveness.
- Can we handle privacy for sound at the hardware level, instead of software? Is that just a very crappy microphone?
Options with my existing hardware
- Second biggest hallmark of motion and aliveness is… motion. Get WiFi DensePose to work, since I already have enough hardware to triangulate multiple people. Need to change my home router to mixed-mode WPA2/WPA3. Data includes body pose, breathing, heartbeat, motion and presence.
- Possible limitations around having the router not in the living room. Solve with 3rd ESP32-S3 (use Rev TFT board).
- Q: If done with Meshtastic, could we look at body pose of…. a whole city? 😮 !!!!
- Put my 8 VLX distance sensors to work and form a "sensing" cage to map out the space in Three.js.
Without changing anything
- Can use the PIR, based on clusters of times when it gets activated to see activity.
Week 4 and 5 - Device 1
My initial goal was to detect the (electromagnetic) life of a room by picking up magnetic field disturbances from nearby electronics (phones, laptops, etc.), but through testing I realized the range is very small (2 to 10cm) and there’s too much noise for this to be appropriate. I’m now layering sensors for an ambient environment logger that builds a continuous, multi-dimensional portrait of a room over time. It makes visible what is normally invisible: the electromagnetic texture of a space, the presence or absence of people, the rhythms of activity and rest that define how a room actually lives and the people who occupy it. The output should be an art piece inspired by Daniel Canogar’s Billow III (2020). I’m also hoping to scale from there some ‘living’ furniture pieces that trigger local AI workflows, but this in all likelihood will take form after graduation.
# Check IP Address and update Arduino sketch
ipconfig getifaddr en0
# Change
# Terminal 1: receives ESP32 data
nc -l 8080 | tee -a log.json &
# -a (append) to preserve previous data across reconnects.
# Terminal 2: serves the dashboard
python3 -m http.server 3000

Example log line:
{"device":"RoomSensor","ts":1305264,"mag":{"x":16.00,"y":4.00,"z":-9.00,"mag":18.79},"pir":{"raw":0,"motion":false}}
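Parsing one of those log lines on the dashboard side could look like this sketch, which assumes the record shape shown in the sample (device, ts, nested mag and pir objects):

```javascript
// Parse one line of log.json and pull out the fields the dashboard plots.
function parseLogLine(line) {
  const rec = JSON.parse(line);
  return {
    device: rec.device,
    ts: rec.ts,
    magnitude: rec.mag.mag, // combined field strength in µT
    pirRaw: rec.pir.raw,
    motion: rec.pir.motion,
  };
}
```

Note the `mag.mag` field is already the magnitude of the three axes: √(16² + 4² + (-9)²) ≈ 18.79.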
Could use a Web Socket.
V1

Electronics in my project
- Sparkfun ESP32 Thing Plus (USB-C)
- Waveshare BMM150 3-Axis Magnetometer Sensor
- SparkFun OpenPIR. Will mount this or a distance sensor facing a doorway or desk area in a wirelessly connected separate device.
Next steps
- Attach to virtual server and leave it at home
- Collect data over time
- Give meaning to PIR data
- Establish baseline for magnetometer
[Inspiration: Kai Lab's "CMYK Searchspace 2.0", a fully analog colour synthesizer with no flicker and no PWM dimming, via Instagram]