# saturnOS

> new OS based on nix :3

NOTE: this repository is now a meta-repository, used to track builds of the OS and meta-issues with the OS and project that aren't relevant to any individual part of the system . the rest of the README will be updated accordingly soon .

ok so there have been way too many failed convergent OS concepts, i could list some of them but i'm sure you all know of examples . the biggest problem, as i see it, is that they try to use the same app for all platforms . that's not only too much stress on the dev, it's also always going to produce a worse result, and it's nearly impossible to scale to vr/ar, where the idea of a flat window is an ugly holdover from its 2d ancestors .

so what's the solution ? breaking apps apart into component pieces, and swapping them out at runtime .

apps now have three layers:

  1. the data layer, where the state of the application is held
  2. the layout engine, which decides where components should be
  3. the renderer which can be swapped out to be more system-native

on saturn, layers 2 and 3 will be the same thing for system applications, since they'll be managed by the system, but the data layer could still be swapped out easily, and so could layers 2 or 3 if necessary .
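the three-layer split above can be sketched in a few lines of python . to be clear, this is a hypothetical sketch : none of these class names are a real saturnOS API, they just show how each layer can be replaced without touching the others .

```python
# Hypothetical sketch of the three-layer app model. The names
# (DataLayer, LayoutEngine, TextRenderer) are illustrative only.

class DataLayer:
    """Layer 1: holds the state of the application."""
    def __init__(self, songs):
        self.songs = songs

class LayoutEngine:
    """Layer 2: decides where components should be, as an abstract tree."""
    def layout(self, data):
        return {
            "type": "list",
            "children": [{"type": "label", "text": s} for s in data.songs],
        }

class TextRenderer:
    """Layer 3: one of several swappable renderers for the same tree."""
    def render(self, tree):
        return "\n".join(child["text"] for child in tree["children"])

# Each layer only sees the output of the layer above it, so any one of
# them can be swapped out at runtime without the others noticing.
data = DataLayer(["Track A", "Track B"])
tree = LayoutEngine().layout(data)
print(TextRenderer().render(tree))
```

a system-native renderer would take the place of `TextRenderer` here, consuming the same abstract tree .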

but what exactly does this look like in practice ?

let's take a music application as an example .

## The data layers

so firstly, you have the data layers of this application . these will of course include a filesystem layer, which scans your ~/Music folder for music, but the appstore will also have plugins that let the application grab music from other sources such as youtube, spotify, etc . all the songs you have get displayed in the same library, and all play seamlessly . ( the basic pitch for scopes )
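that plugin idea can be sketched as multiple data sources feeding one library . again, this is a hypothetical sketch, not a real plugin API : the source classes and `library` helper are made up for illustration .

```python
# Hypothetical data-layer plugin model: every source exposes the same
# interface, and the library merges them into one seamless list.

class FilesystemSource:
    """Built-in data layer; a real one would scan ~/Music."""
    def songs(self):
        return ["local-song.flac"]

class StreamingSource:
    """A store-installed plugin, e.g. for a streaming service."""
    def songs(self):
        return ["streamed-song"]

def library(sources):
    # Merge every source into one combined, sorted library.
    return sorted(s for src in sources for s in src.songs())

print(library([FilesystemSource(), StreamingSource()]))
```

adding a new music source then means shipping one more class with a `songs()` method, nothing else in the app changes .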

ok that's great, but how does it interact with the other layers ?

## The layout engine

the layout engine seems pretty simple at first, but it's actually critical for this to work, and its job gets pretty complex depending on the platform .

the layout engine will react to changes in form factor, like taking a phone out of a headset, or plugging in a mouse, keyboard, and display, but it will also just handle changes in window size . to allow for this type of flexibility, it needs to be able to create an abstract layout that can work in both 2D and 3D . the specifics of this i haven't entirely fleshed out yet, but the basic concept is that the layout will be as specific as it needs to be : it will describe a type of object, such as a list object, its members, and how everything branches together in a tree-like structure .
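the "type of object, its members, and a tree" idea is concrete enough to sketch . this is only a guess at what such a node might look like, not the actual layout format :

```python
# Hypothetical abstract layout node: a kind ("list", "button", ...),
# its member nodes, and nothing about pixels or positions -- the
# renderer decides what each kind looks like on each form factor.
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str
    members: list = field(default_factory=list)

def depth(node):
    """How deeply the tree is branched."""
    return 1 + max((depth(m) for m in node.members), default=0)

# A window containing a list with two items.
ui = Node("window", [Node("list", [Node("item"), Node("item")])])
print(depth(ui))
```

because the tree carries only semantics ( "this is a list with these members" ), the same structure can be laid out as a flat column in 2D or a physical stack in 3D .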

but how does this abstract data get turned into a graphical interface ?

## The rendering engine

the job of the rendering engine is, in theory, pretty simple : it takes the structure of the UI and builds an image . but this is not really a simple process across all of the form factors the application needs to support . for VR, for example, it would need a set of widgets which are interesting and interactive . lists should have physicality, scroll boxes should be wheels ! the snip tool should be a pair of scissors you can hold and interact with . it's just a widget toolkit, but it's going to be a widget toolkit you can feel and play with . on 2D devices it will lose a lot of its skeuomorphic interactivity, but it still has to work and perform well on devices of varying screen size and interaction style, so it's still a bit of a challenge .
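the key point, that one abstract widget gets different realizations per form factor, can be shown with a tiny hypothetical example ( these render functions are made up, just to show the shape of the idea ) :

```python
# Hypothetical: the same abstract scroll-box widget, realized
# differently by the 2D and VR renderers.

def render_2d(widget):
    # On flat screens, a scroll box is a conventional scrollbar.
    return f"flat scrollbar for {widget}"

def render_vr(widget):
    # In VR, the README's pitch: scroll boxes become physical wheels.
    return f"physical wheel for {widget}"

# The app describes "a scroll box of songs" once; the platform
# picks the renderer.
for render in (render_2d, render_vr):
    print(render("song-list"))
```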

ok, so overall it's a lot of work, but i have faith that this type of model would make apps really easy to develop : for some applications it's as simple as writing a data layer . it lets developers write only as much of the application as they need to, and let the system handle the rest for them .

## Development

to check for syntax errors, run `nix repl replEval.nix` and, once inside the repl, evaluate `imported` .

## Setup for non-NixOS systems

Install nix on your system following the instructions here.

Add the nixos 22.11 channel to your system, since we're basing on the latest stable branch for now.

```
nix-channel --add https://nixos.org/channels/nixos-22.11 nixos
```

Enable flakes, as we're using a flake for our build process.

```
mkdir -p ~/.config/nix
echo "experimental-features = nix-command flakes" >> ~/.config/nix/nix.conf
```

## Building and Running

We are using the `nixos-generators` package to help coordinate builds.

### VM

To build the VM:

```
nix run github:nix-community/nixos-generators -- -c default.nix -f vm -o result/vm
```

This will build the VM and register the result in the `result/vm` symlink.

To run the VM, use `sudo result/vm/bin/run-qemu-vm`. (sudo can probably be avoided if you are in the KVM group... I think. TODO Lambda look this up)

You can combine the two steps with the following:

```
sudo $(nix run github:nix-community/nixos-generators -- -c default.nix -f vm -o result/vm)
```

Or, to build and run without registering a link (good if you want a throwaway VM which can be garbage collected):

```
sudo $(nix run github:nix-community/nixos-generators -- -c default.nix -f vm)
```