
Community-learned Language Models

With the help of Sameera Mudgal and other collaborators, we have shared a proposal to extend the ideas of Sun Shines Bright to the community.

Embracing grounding as a sharing and learning experience for the community over making and hacking.

Sun Shines Bright reimagines our relationship with AI and the cloud by demystifying and DIY-ing local and self-hosted AI servers, and cultivating community-led AI systems. The project proposes that grounding technology is as much about building agency and accessibility of usage for diverse creator communities as it is about the materiality of systems to build it. We propose that to “ground the cloud”, we need to embody the principles of care and camaraderie in our technological interventions by:

Rather than naively offering alternative prototypes and end products for one-time use and showcase, we propose simple, radical ways of community building and facilitation that respond with deep, situated and diverse knowledge. We create access, agency, control and understanding of complex AI systems which, when combined with inexpensive and easy-to-use hardware, create potent critical and long-term alternatives to invasive, biased and blackbox systems. We see this effort aligned with the idea of the Solar Mamas of the Barefoot College, India, where rural women were trained in solar electrification of their villages, empowering themselves and their extended community. We envision our project having a similar ripple effect - a joyous space for knowledge building and care-forward tinkering with technology.

Representative image

Premise and Urgent Context

Contemporary life is inextricably linked to massively scaled AI systems trained on biased datasets and hosted in energy-intensive data centers. As an alternative, can community-learned, solar-powered small language models create new forms of technological intimacy that resist extraction while fostering care?

Inspired by Solar Protocol, which its creators call ‘A Naturally Intelligent Network’, Sun Shines Bright’s proof of concept simultaneously explores low-energy, sustainable model hosting and the opportunities and efficacy of Small Language Models. The early experiments (Jan-Mar 2025) and a subsequent workshop at NYU Shanghai (April 2025) proved that building a simple solar-powered server for hosting AI models is not only possible but can be easily shared and explored with students and creatives from non-technology backgrounds. This was followed by an exhibit and zine about Sun Shines Bright at AIxDESIGN Festival: On Slow AI (May 2025) and the Chaos Feminist Convention (July 2025). Several feminist non-technologist creators have expressed their interest in building their own solar servers. This reflects an urgent need for:

Massively scaled models are climatically problematic and also do not reflect critical feminist, non-white data. Now more than ever, we need camaraderie, friendship and intimacy as pillars of building agency and ease around fast-paced, complex AI systems.

My son’s version of the Representative image

Workshop NYU Shanghai

In April 2025, I was able to share the idea of hosting your own language models through a small workshop at Interactive Media Arts, NYU Shanghai!

On its own, setting up the solar server’s software/server system is quite simple! What was challenging was figuring it out at the university, where restricted internet access and a proxy setup are part of state mandates. Anyway, after a tough few hours, this was the simple system I figured out on the wireless network of the 3D printer (which didn’t have access to the internet!)

learnt a lot of bash commands! :P

The workshop included:

  1. Accessing the Raspberry Pi via SSH
  2. Installing Docker and Ollama
  3. Testing Ollama with Tinyllama
  4. Using the API endpoint from the Raspberry Pi to control a p5.js sketch! :) (a terminal version of this request is sketched below)
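
For the last step, the sketches talk to Ollama’s API endpoint on the Pi; a terminal version of roughly the same request looks like this (raspberrypi.local is a placeholder for the Pi’s actual hostname or IP, tinyllama as in step 3):

# ask TinyLlama for a single, non-streaming completion over the local network
curl http://raspberrypi.local:11434/api/generate \
  -d '{"model": "tinyllama", "prompt": "Describe the sun in one line", "stream": false}'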

participants SSH-ing into their Pis

The sketch is hosted here!

computational mama talking about the solar installation in her home

computational mama explains the various components needed for the project

participants SSH-ing into their Pis

LLM can only bro :(

The quest was on to check the general speed and lightness of Small Language Models. At the moment I’ve tried tinyllama and tinydolphin. As a test, I ask these four questions; the first two are a bit more generic, and 3 and 4 are closer to my research and experiments:

| S.No. | Question | Answer should have been |
| --- | --- | --- |
| 1. | What is the capital of India? | New Delhi |
| 2. | Who was the emperor before Julius Caesar? | Julius Caesar was not an emperor; he was a dictator. Before his rule, Rome was governed by the Senate. Caesar’s heir Augustus later became the first Roman Emperor. |
| 3. | Who is Gayatri Spivak? | Gayatri Chakravorty Spivak (born 24 February 1942) is an Indian scholar, literary theorist, and feminist critic. She is a University Professor at Columbia University and a founding member of its Institute for Comparative Literature and Society. |
| 4. | What is her most famous written piece? | Spivak is best known for her essay “Can the Subaltern Speak?” and her translation of and introduction to Jacques Derrida’s De la grammatologie. |
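
For reference, a one-shot way to ask any of these from the terminal (not necessarily how I ran them during testing) is:

ollama run tinyllama "Who is Gayatri Spivak?"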

Screenshot of Answer from Tinyllama

Screenshot of Answer from Tinydolphin

Ollama on Docker

I started out by assuming that I would need to run Ollama on Docker too, which is neatly described here: https://ollama.com/blog/ollama-is-now-available-as-an-official-docker-image

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

If all goes well, go to:

http://<<rpiname>>.local:11434/

You should see a simple message: “Ollama is running”. This ran fine in the terminal, and then I also just straight up installed Ollama on the server. I don’t know if this was a wise choice but I just did it! :) The Linux install documentation was very simple, and running LLM applications from Ollama directly made sense to me, especially if I want to try out some non-chat related projects like I have with Archive of Lost Mothers.
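
For completeness, both routes boil down to a command or two - these come from the Ollama Docker blog post and the Linux install docs respectively, with tinyllama simply being the model I picked later:

# talk to a model inside the Docker container started above
docker exec -it ollama ollama run tinyllama

# or the native Linux install, a one-liner from the docs
curl -fsSL https://ollama.com/install.sh | sh
ollama run tinyllama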

This meant I had to think about an alternative to the bundled Ollama and OpenWebUI Docker thing.

Yay!

Eventually I found that I was able to run it by adding the --network=host flag - more about it here.

The code I ran:

docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main

This worked fine and I was able to get the elegant OpenWebUI running on https://solarchat.cmama.xyz (this link won’t work without a login provided by me - I don’t want you to burn up my RPI!) Ping me if you want to test it. Of course, the computation for the web UI and the general limits of the Pi’s compute power mean that the responses are slow.

Next Steps

My to-dos with this experiment are:


  1. The Raspberry Pi probably can’t handle training a language model, so this part of the experiment won’t be solar! ↩︎

Solar Realism

The most daunting part of building the setup was getting all the power connections operating correctly. As it turns out, it took me like 10 minutes to set it up! The diagram will give you an idea of how it works!

This is how I set up the solar server on a small balcony

A wooden strip/shelf with the solar controller and a small shelf for the raspberry pi

I strapped in the raspberry pi with a cable organiser velcro.

Connecting the battery to the Solar Charge Controller

Temporary setup for the solar panel; once I have more sense of the sun coverage and the voltage draw, I’ll fix up a more permanent solution

Bill of Materials

| Item | Cost (₹) |
| --- | --- |
| Solar Charge Controller 20A | 549 |
| Solar Panel 55W | 2700 |
| 12V 8Ah Battery (bought locally) | 890 |
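
As a rough, unmeasured back-of-envelope (assuming the Pi draws around 5W on average while serving a small model, a figure I haven’t verified), the battery alone should carry the server through a night:

# 12V x 8Ah battery capacity, in watt-hours
echo $((12 * 8))   # -> 96 Wh
# hours of runtime at roughly 5W, ignoring conversion losses
echo $((96 / 5))   # -> about 19 hours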

Self-hosting and perils beyond

Last September, my colleague Kaustubh gave a wonderful talk about how he is self-hosting from an old laptop. He spoke very excitedly about his intentions with it, and how he achieved everything. Kaustubh’s technical know-how is exceptional, and he communicated the idea very simply.

It finally gave me a sense of what people mean when they use the term “server”. I think it never struck me that it quite literally means “to serve”. So the server serves websites or other computational things!

After the talk, I was super excited, wondering what it might mean to host my own server, and if I could potentially do it with a Raspberry Pi 4 that is lying around!

Solar dreams

In Oct/Nov 2024, I sat in on an info session from Rhizome.org, where a speaker explained a long-standing project called Solar Protocol. This also hosts simple websites on Raspberry Pis - the amazing part is that they use solar power to keep everything running!

My combined excitement about self-hosting, Raspberry Pis and solar power kicked in hard in 2025, and this post is just a long-form record of the things I did.

Yunohost and ISP woes!

After digging through basic resources like r/selfhosted and other lists, I found the documentation for Yunohost (pronounced Y-U-No-Host) very simple and easy to try. There was no code required for the setup. While I can code some and SSH into Raspberry Pis, I find that if the setup is easy, it’s a format/tool that I can share with others who aren’t technologists. To me, arming makers with an understanding of technology is always as exhilarating as learning something new myself.

Sure enough, after a simple image flash, Yunohost was installed and the admin panel was running as a browser interface!

The diagnostics tool told me that everything was looking fine aside from my port forwarding.

After some more youtube-ing, and weird bumps on the way, I was able to get all the port forwarding details in place. But nothing worked!

I called my ISP team, and they assured me it wasn’t on their end and all the ports were open! So I struggled a bit more, searching through the internet and youtube!

In the searches, two things kept coming back:

  1. A playlist by Pi-hosted which uses RaspiOS+portainer.io (a docker something) and,
  2. “Tunnels” instead of port forwarding

I kept resisting these solutions, trying my best to get Yunohost up and running. Finally the ISP connected me to their senior networking engineer, who explained that I would have to buy a public static IP. This service costs 2.5x more in Udaipur (perhaps because the ISP is local and this request is not as common).

After a quick chat with Kaustubh about this, I decided to try a tunnel with Cloudflare; again, I couldn’t get this to work with Yunohost.

Apparently, once Yunohost is installed and running with port forwarding, it splits the use of localhost and the dedicated domain into two parts:

Back to the drawing board

After all the Yunohost tears were wiped away, the steps were pretty clear:

  1. Install raspiOS headless (this was easy as I have tried this several times)
  2. Test the services I want to host like the LLM, Jellyfin etc.
  3. Host via Docker so things are running neatly in containers (I’m still a noob on this stuff - but had to figure it out as best I could)
  4. Test the services on docker and decide what happens next
  5. Take the plunge into solar powering this shindig!
  6. See where this goes! :)

For this post, I will focus on steps 2, 3 and 4 (Step 5 is still a dream)!

NOTE: All this was done entirely over SSH.
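
(If SSH is new to you, it just means driving the Pi from another machine’s terminal, something like the line below - the username and hostname here are the common Raspberry Pi OS defaults, not necessarily mine.)

ssh pi@raspberrypi.local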

Bro, can you even LLM?

Hosting an LLM on the Raspberry Pi was my main goal. After building Mother Machines v2 last year, I’ve been thinking a lot about local computing. There is something very empowering about discovering that your computer (your ‘daily driver’) can actually generate AI images. Imagine if a tiny computer like a Raspberry Pi could run a chatbot!

The steps for this were:

  1. Run Ollama with a simple and smol LM. I used Tinyllama! The choice of model was PURELY based on the fact that it was called “Tiny”llama.
  2. Connect to OpenWebUI’s chat interface
  3. Test on the local network
  4. Tunnel via Cloudflare! (a rough sketch of this is below)
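
The quickest sketch of step 4 is Cloudflare’s throwaway tunnel, assuming OpenWebUI is listening on port 8080 on the Pi (a named tunnel with extra configuration is needed to put it behind your own domain):

cloudflared tunnel --url http://localhost:8080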

All these steps worked quite well, but obviously they stop when I close the terminal (I’m a forever noob with terminal things).
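
One common workaround (not what I was doing at the time) is to run everything inside tmux, which keeps processes alive after the SSH session closes:

# install and start a named tmux session on the Pi, then run your commands inside it
sudo apt install tmux
tmux new -s solar
# detach with Ctrl-b then d; reattach later with:
tmux attach -t solar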

tinyllama

Trying Docker

I had never used Docker, but it seemed to be an easy and well-documented way to run servers, and in particular to use a Raspberry Pi as a server. You can find more about Raspberry Pi hosting here: https://pi-hosted.com/, by the amazing YouTuber Novaspirit Tech.

You can follow the instructions available on their website/youtube playlist!

Once I had tested simpler Docker containers like Jellyfin, I was interested in running Ollama+OpenWebUI as a Docker container. There are several tutorials for this, but I think the best solution was the most obvious one - using OpenWebUI’s README! (credit to sud who shared this link with me!)
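
For reference, the CPU-only “bundled Ollama” variant from that README looks roughly like this (ports and volume names as in the README; no GPU flags, since the Pi has none):

docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama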

This was very exciting - however, I had no active cooling on my Raspberry Pi 4, and it was overheating the board (I could smell some plastic burning!).

I’ll write more about the solar connection in the next post.
