Show HN: OpenSnowcat – A fork of Snowplow to keep open analytics alive
I’ve been a long-time Snowplow user and unofficial evangelizer. I have deep respect for its founders, Alex and Yali, who I met a few times.
What made me fall in love with Snowplow was that it was unopinionated, gave access to raw event data, and was truly open source. Back in 2013, that changed everything for me. I couldn’t look at GA the same way again.
Over the years, analytics moved into SQL warehouses driven by cheaper CPU/storage, dbt, reproducibility, and transparency. I saw the need for a democratized Snowplow pipeline and launched a hosted version in 2019.
In January 2024, Snowplow changed its license (SLULA), effectively ending open-source Snowplow by restricting production use. When that happened, I realized the spirit of open data and open architecture was gone.
A week later, I forked it; I wanted to keep the idea alive.
OpenSnowcat keeps the original collector and enricher under Apache 2.0 and stays fully compatible with existing Snowplow pipelines. We maintain it with regular patches, performance optimizations, and integrations with modern tools like WarpStream's Bento for event processing/routing.
The goal is simple: keep open analytics open.
Would love to hear how others in the community think we can preserve openness in data infrastructure as “open source” becomes increasingly commercialized.
That's it. I should have posted here earlier, but now felt right.
Comments URL: https://news.ycombinator.com/item?id=45685793
Points: 65
# Comments: 15
Thu, 23 Oct 2025, 7:24 pm
Show HN: I built a tech news aggregator that works the way my brain does
An honest-to-god, non-algorithmic, reverse-chrono list of tech news that passes my signal-to-noise tests, updated hourly.
As lightweight a page design as I've been able to keep it: simple, clean, fast. No commercial features or aspirations - this is a passion project, something I've been fooling around with on and off for decades.
There's a "Top" view too, with an LLM-edited front page & summary, and categorized views for a large number of topics - see the Directory. There are a few more buried features to explore, but the fundamental use case is pop in, scan, exit - fast and concise.
Your feedback would be appreciated!
Comments URL: https://news.ycombinator.com/item?id=45684689
Points: 148
# Comments: 83
Thu, 23 Oct 2025, 5:48 pm
Show HN: Tommy – Turn ESP32 devices into through-wall motion sensors
Hi HN! I would like to present my project called TOMMY, which turns ESP32 devices into motion sensors that work through walls and obstacles using Wi-Fi sensing.
TOMMY started as a project for my own use. I was frustrated with motion sensors that didn't detect stationary presence and left dead zones everywhere. Presence sensors existed but were expensive and needed one per room. I explored echo localization first, but microphones listening 24/7 felt too creepy. Then I discovered Wi-Fi sensing - a huge research topic but nothing production-ready yet. It ticked all the boxes: could theoretically detect stationary presence through breathing/micromovements and worked through walls and furniture so devices could be hidden away.
Two years and dozens of research papers later, TOMMY has evolved into software I'm honestly quite proud of. Although it doesn't have stationary presence detection yet (coming Q1 2026), it detects motion really well. It works as a Home Assistant Add-on or Docker container, supports a range of ESP32 devices, and can be flashed through the built-in tool or used alongside existing ESPHome setups.
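TOMMY's actual detection pipeline isn't described here, but the core idea behind Wi-Fi-sensing motion detection can be sketched in a few lines: a person moving near the transmitter or receiver perturbs the radio channel, which shows up as variance in otherwise stable channel amplitude readings. A minimal toy illustration (the window size, threshold, and sample data are all made up for the example):

```python
import statistics

def detect_motion(amplitudes, window=8, threshold=0.5):
    """Toy motion detector: flag each sliding window whose standard
    deviation of Wi-Fi channel amplitude samples exceeds a threshold.
    An empty room yields a nearly flat signal; movement perturbs it."""
    flags = []
    for i in range(len(amplitudes) - window + 1):
        chunk = amplitudes[i:i + window]
        flags.append(statistics.pstdev(chunk) > threshold)
    return flags

# Stable channel (empty room) vs. perturbed channel (someone walking by).
idle = [1.0, 1.01, 0.99, 1.0, 1.02, 0.98, 1.0, 1.01]
busy = [1.0, 2.3, 0.2, 1.8, 0.5, 2.1, 0.4, 1.9]
print(any(detect_motion(idle)))  # False
print(any(detect_motion(busy)))  # True
```

Real systems work on per-subcarrier channel state information and need far more signal processing (and, for stationary presence, sensitivity to breathing-scale micromovements), but the variance-over-a-window intuition is the starting point.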
I released the first version a couple of months ago on Home Assistant's subreddit and got a lot of interest and positive feedback. More than 200 people joined the Discord community and almost 2,000 downloaded it.
Right now TOMMY is in beta and is completely free for everyone to use. I'm also offering free lifetime licenses to every beta user who joins the Discord channel.
You can read more about the project on https://www.tommysense.com. Please join the Discord channel if you are interested in the project.
A note on open source: There's been a lot of interest in having TOMMY as an open source project, which I fully understand. I'm reluctant to open source before reaching sustainability, as I'd love to work on this full time. However, privacy is verifiable - it's 100% local with no data collection (easily confirmed via packet sniffing or network isolation). Happy to help anyone verify this.
Comments URL: https://news.ycombinator.com/item?id=45684230
Points: 77
# Comments: 59
Thu, 23 Oct 2025, 5:04 pm
Show HN: Git for LLMs – A context management interface
Hi HN, we’re Jamie and Matti, co-founders of Twigg.
During our master’s we kept hitting the same pain points when using LLMs. The linear nature of typical LLM interfaces - like ChatGPT and Claude - made it easy to get lost, with no good way to visualise or navigate a project.
Worst of all, none of them are well suited to long-term projects. We found ourselves spending days in the same chat, only for it to eventually break, and transferring context from one chat to another is cumbersome. We decided to build something closer to the way humans actually think.
We started with two simple ideas: chat branching for exploring tangents, and an interactive tree diagram for easy visualisation and navigation of your project.
Twigg has developed into an interface for context management - like “Git for LLMs”. The input to a model - its context - is fundamental to its performance, and to extract the most from an LLM, users need complete control over exactly what context the model sees. Twigg gives you that control through simple operations like cut, copy, and delete on your tree.
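Twigg's internals aren't public, but the branching-chat idea maps naturally onto a tree where each message is a node and the context sent to the model is the root-to-node path. A minimal sketch (class and method names are my own, purely illustrative):

```python
class ChatNode:
    """One message in a branching conversation tree."""
    def __init__(self, role, text, parent=None):
        self.role, self.text = role, text
        self.parent = parent
        self.children = []
        if parent:
            parent.children.append(self)

    def branch(self, role, text):
        """Start (or extend) a branch from this point in the conversation."""
        return ChatNode(role, text, parent=self)

    def context(self):
        """Context to send to the model: the root-to-here message path.
        Sibling branches are excluded, so tangents don't pollute the prompt."""
        path, node = [], self
        while node:
            path.append((node.role, node.text))
            node = node.parent
        return list(reversed(path))

# A tangent branched off the first answer doesn't leak into the main thread.
root = ChatNode("user", "Summarise this paper")
answer = root.branch("assistant", "Here's a summary...")
tangent = answer.branch("user", "Wait, what does 'ablation' mean?")
main = answer.branch("user", "Now draft the intro")
print(main.context())
```

Cut/copy/delete then become ordinary tree operations: detaching a subtree, duplicating it under a new parent, or dropping it from the path entirely.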
Through Twigg, you can access a variety of LLMs from all the major providers, like ChatGPT, Gemini, Claude, and Grok. Aside from a standard tiered subscription model (free, plus, pro), we also offer a Bring Your Own Key (BYOK) service, where you can plug and play with your own API keys.
Our target audience is technical users who use LLMs for large projects on a regular basis. If that sounds like you, please try out Twigg - you can sign up for free at https://twigg.ai/. We would love to get your feedback!
Comments URL: https://news.ycombinator.com/item?id=45682776
Points: 73
# Comments: 23
Thu, 23 Oct 2025, 3:12 pm
Show HN: Deta Surf – An open source and local-first AI notebook
Hi HN!
We got frustrated with the fragmented experience of exploring & creating across our file manager, the web and document apps. Lots of manual searching, opening windows & tabs, scrolling, and ultimately copying & pasting into a document editor.
Surf is a desktop app meant for simultaneous research and thinking to minimize the grunt work. It’s made of two parts:
1) A multi-media library where you can save and organize files and webpages into collections called Notebooks.
2) An LLM-powered smart document which you can auto-generate using the context from any stored page, tab or entire notebook. This document contains deep links back to the source material — like a page of a PDF or a timestamp in a YouTube video. Unlike Deep Research products (or NotebookLM's chat), the entire thing is editable, and the user stays in the loop.
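Surf's own anchor format isn't documented in this post, but the deep links it describes can be built with standard URL conventions: the `#page=N` PDF open parameter that most in-browser PDF viewers honour, and YouTube's `t` query parameter for a start time in seconds. A hypothetical helper (the function name and URLs are mine, for illustration):

```python
def deep_link(source_url, page=None, seconds=None):
    """Build a deep link to a specific spot in a source:
    a PDF page via the '#page=N' fragment, or a moment in a
    YouTube video via the 't' (seconds) query parameter."""
    if page is not None:
        return f"{source_url}#page={page}"
    if seconds is not None:
        sep = "&" if "?" in source_url else "?"
        return f"{source_url}{sep}t={seconds}"
    return source_url

print(deep_link("https://example.com/paper.pdf", page=4))
print(deep_link("https://www.youtube.com/watch?v=abc123", seconds=95))
```

Keeping references in this form means a generated document can point at the exact page or timestamp it drew from, rather than just the source file.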
With a technology like AI, context/data is proving to be king. We think it should stay under the user’s control, with minimal lock-in: you can own & export your data, and plug & play with different models. That’s why Surf is:
- Open Source on GitHub
- Open (& Local Data): the data saved in Surf is stored on your local machine in open and accessible formats and mostly works offline.
- Open Model Choice: you can choose which models you use with Surf, and can add custom & Local LLMs
Early users include students & researchers who are learning and doing thematic research using Surf.
GitHub repo: https://github.com/deta/surf/
Website: https://deta.surf/
Comments URL: https://news.ycombinator.com/item?id=45680937
Points: 118
# Comments: 39
Thu, 23 Oct 2025, 12:11 pm