Accessing the solar website:

Wilderland has two websites: one always-on, higher-resolution website that is heavier to run, and one experimental, situated site that is solar powered and located in Nephin Park.

The solar site might not always be on or available; it also has other quirks and will be slower. Adjust your expectations for this Permacomputing + Small Web alternative.


Permacomputing: workshop 3 — smolweb / Small websites

Notes and resources for the third PMC workshop, held online on Wednesday 06/08/2025.

Header image credits: screenshot of the 'Avoid Software' publication spreads, Sarah Garcin & Quentin Juhel

schedule outline

  • 10.00: start
  • 10.00 to 10.30: participant introductions, project introduction, Materials Matter and Wilderland, PhD intro
  • 10.30 to 11.00: part 1 - physical web
  • 11.00 to 11.30: part 2 - web publishing tools, heavy and light
  • 11.30 to 12.15: part 3 - inspecting websites
  • 12.15 to 13.00: discussion, thoughts, further issues

Ways of doing

This workshop is meant to be enjoyable and relaxed. It aims to share information in a usable way and provide hands-on experience. If you have any questions or concerns, please feel free to speak up; don’t be afraid to interrupt.

Project introductions

disclaimer + credits + acknowledgements

The content shared today is part of developing artistic research. It is shared in good faith, but do be careful when running terminal commands if you are not used to doing so. Also, each website is likely to be slightly different, so please take care when taking action to reduce website weight. The information I share today was gathered over a number of years, from a number of sources, in somewhat of a chronological/importance order:

part 1 — making the web physical

Inspired by p.14 in Pasek’s zine, let’s use a few tools to see or remind ourselves what the web is made up of.

1. pick a website you know well

this can be your own site, an old project, a particular bugbear, a site connected to an issue you care about, or anything you have a little bit of familiarity with.

2. open your terminal

For Windows users, this can be the ‘Command Prompt’ application (found by searching for ‘CMD’) or ‘PowerShell’ if your system has it. For Mac users, search for the ‘Terminal’ utility. For Linux users, it will be similar, but you are likely already familiar with the terminal.

3. determine the site’s ip address

determine the site’s IP address by typing ping [thedomainofthesite.com] and hitting enter. For example, ping climatejusticeuniversitiesunion.org should produce a result like so:
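(the exact format and timings vary by operating system and network; the lines below are an illustrative sketch, not captured output)

PING climatejusticeuniversitiesunion.org (198.185.159.145): 56 data bytes
64 bytes from 198.185.159.145: icmp_seq=0 ttl=52 time=24.312 ms
64 bytes from 198.185.159.145: icmp_seq=1 ttl=52 time=23.871 ms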

to stop pinging, hit CTRL + C

4. note the IP address

from your ping query; for my example, the IP address of the climatejusticeuniversitiesunion.org site is 198.185.159.145

5. rough ‘geolocation’

we can now do a very rough ‘geolocation’ of the site by running this next command:
curl ipinfo.io/198.185.159.145

6. Lat Lon!

the result of that last command will look something like this:

{
  "ip": "198.185.159.145",
  "city": "New York City",
  "region": "New York",
  "country": "US",
  "loc": "40.7339,-74.0054",
  "org": "AS53831 Squarespace, Inc.",
  "postal": "10014",
  "timezone": "America/New_York",
  "readme": "https://ipinfo.io/missingauth",
  "anycast": true
}

The “loc” line gives us what we have been looking for: a latitude and longitude of where this IP address actually is in the world. This is a rough geolocation; it is not exactly accurate, but it gives you an idea. The lat lon 40.7339,-74.0054 actually puts us in the West Village of Manhattan. This is more likely to be the HQ of Squarespace, not necessarily the data centre itself, but some of their computational infrastructure might indeed be in an office in the West Village. Your mileage may vary, depending on the website you are looking up.

7. using traceroute

To get a more accurate picture of where the data / website / components may be, we can use ‘traceroute’. Traceroute shows the journey your requests have to take to resolve.


  • On Windows run: tracert [your ip address]
  • On Mac run: traceroute [your ip address]
  • On Linux I run: sudo traceroute -T [the ip address]
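For example, on a Mac, pointed at the Squarespace IP address from step 4 (the hops you see will differ with your network and location):

traceroute 198.185.159.145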

8. reading the traceroute results

Hop | Host / Info | Explanation
1 | _gateway (192.168.1.1) | Your local router or home gateway
2 | lo1001.bas103.bmt.btireland.net (193.95.131.39) | ISP (BT Ireland) entry point
3-6 | Various BT Ireland core routers | Traversing BT Ireland’s network core
7 | 166-49-168-130.gia.bt.net | BT Global Internet Access router
8 | t2c4-et-5-1-5.uk-lof.gia.bt.net | UK-based BT transit node (likely London)
9 | akamai.prolexic.com (195.66.224.31) | Akamai’s DDoS protection / CDN entry
10-11 | akamaitechnologies.com nodes | Akamai CDN edge servers
12 | * * * | Silent hop — no response (normal)
13-14 | 198.185.159.145 | Destination (Squarespace server) reached

You can also use tools like https://traceroute-online.com/ to get visual results for your traceroutes.

part 2 — how websites are made, CMS tools, publishing tools, hosting tools

Before considering the pros and cons of various publishing methods, a small theoretical detour — Decolonial computing: ‘a critical project, which is about interrogating who is doing computing, where they are doing it, and, thereby, what computing means both epistemologically (i.e. in relationship to knowing) and ontologically (i.e. in relationship to being).’ Ali, 2016

If you wish to dig further into the decolonial computing idea, some geopolitical/social justice focused tech writings to consider might be:


Hand-Made Website | Static Site Generators (SSGs) | LAMP/WAMP Stack (Linux, Apache, MySQL, PHP) | MEAN/MEVN Sites (MongoDB, Express, Angular/Vue.js, Node.js)
Manually written HTML/CSS/JS files; no frameworks or automation. | Use templates and markdown to build pre-rendered static sites. | Traditional web stack, was the standard for many years. | Modern/excessive full-stack JavaScript frameworks for dynamic web apps.
Examples: HTML5, CSS3, Vanilla JS | Examples: Jekyll, Hugo, Eleventy, Gatsby | Examples: WordPress, Joomla, Drupal | Examples: Facebook, Twitter, Instagram, Ryanair.com, large sites
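To make the SSG column concrete, here is a minimal sketch of that workflow using Hugo (one of the generators named above); it assumes Hugo is installed, and a real site would also need a theme and some markdown content:

hugo new site mysite   # scaffold a new project
cd mysite
# ...add a theme, and markdown pages under content/...
hugo                   # render everything into static HTML in ./public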

Moving from left to right, it is fair to say that the technical requirements increase, as does the weight of the website, and therefore the ecological impact. Using various database formats also makes your content harder to migrate from one system to another. However, some tools exist to aid in this process:

part 3 — reducing the weight of websites

Considering where and how your site is published is only one dimension of the issue. The media, content and assets that make up the front-end of the page are also essential considerations. In this section, we discuss tips and tricks for reducing website front-end weight.

[Screenshot: wld-inspector.png, 65 requests, 21.41MB]

[Screenshot: remote-wld-inspector.png, 7 requests, 395kB]
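The browser inspector (as in the screenshots above) is the most complete measure of page weight. For a quick check from the terminal, you can also ask curl how many bytes the HTML document alone transfers; this is a sketch (example.org is a placeholder), and it ignores images, scripts and other sub-resources:

curl -sL -o /dev/null -w '%{size_download} bytes\n' https://example.org/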


  • media compression & image dithering

Reducing the size of a webpage’s media is one of the best and fastest ways to reduce the size of your website. For batch processing, you might consider ImageMagick. For that crispy low-bandwidth dithered permacomputing aesthetic, I dither the remote website images with this shell script:

# dither every JPEG in the current directory (note: this overwrites the originals in place)
for img in *.jpg; do
  echo "$img"
  # halve the image dimensions, then apply an 8x8 ordered dither with a reduced colour palette
  magick "$img" -resize 50% -ordered-dither o8x8,5,5,4 "$img"
done
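Because the loop above overwrites files in place, re-running it will dither already-dithered images. A non-destructive variant (my own sketch, same ImageMagick call) writes the results into a separate folder instead:

mkdir -p dithered
for img in *.jpg; do
  # keep the original; write the resized, dithered copy to dithered/
  magick "$img" -resize 50% -ordered-dither o8x8,5,5,4 "dithered/$img"
done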