23 Commits

Author SHA1 Message Date
e45e7960da Merge pull request 'remove qwen3.5-coder-next' (#200) from feature/ollama-local into main
All checks were successful
Check flake.lock / Check health of `flake.lock` (push) Successful in 9s
Check Nix flake / Perform Nix flake checks (push) Successful in 5m42s
Reviewed-on: #200
2026-03-27 01:38:11 -04:00
0d5bf7e46d remove qwen3.5-coder-next
2026-03-27 01:24:59 -04:00
83c7ef00ad Merge pull request 'automated: Update flake.lock' (#186) from update-flake-lock into main
Reviewed-on: #186
2026-03-26 22:03:23 -04:00
github-actions[bot]
1fbe15c0a0 automated: Update flake.lock
Auto-generated by [update.yml][1] with the help of
[create-pull-request][2].

[1]: https://nayeonie.com/ahuston-0/nix-dotfiles/src/branch/main/.github/workflows/flake-update.yml
[2]: https://forgejo.stefka.eu/jiriks74/create-pull-request
2026-03-26 21:58:32 -04:00
a923f4fd82 Merge pull request 'feature/zed-models' (#199) from feature/zed-models into main
Reviewed-on: #199
2026-03-26 21:57:49 -04:00
802bf1ca4c add ollama back to artemision
2026-03-26 21:42:13 -04:00
b5e45398d8 3 parallel models 2026-03-26 21:42:13 -04:00
623cad29a7 Merge pull request 're-add ollama to configuration' (#198) from feature/zed-models into main
Reviewed-on: #198
2026-03-26 13:02:30 -04:00
fde7963379 re-add ollama to configuration
2026-03-26 12:32:25 -04:00
e0f2f64886 Merge pull request 'ollama stuff' (#197) from feature/fwupd into main
Reviewed-on: #197
2026-03-26 12:27:26 -04:00
0036acbde3 devstral
2026-03-26 12:23:54 -04:00
ecdf223431 new models
2026-03-26 11:40:57 -04:00
6e6a8a205a remove gnome tools
2026-03-26 11:37:11 -04:00
342ff16158 ollama stuff
2026-03-26 11:32:04 -04:00
c75b754ace Merge pull request 'feature/fwupd' (#196) from feature/fwupd into main
Reviewed-on: #196
2026-03-23 23:24:37 -04:00
de45a27860 extend context
2026-03-23 23:17:14 -04:00
3557b88d7c ollama
2026-03-23 23:14:05 -04:00
67e4dc15e7 llama 4 scout 2026-03-23 23:00:51 -04:00
291a15d0c5 Merge pull request 'feature/fwupd' (#195) from feature/fwupd into main
Reviewed-on: #195
2026-03-23 22:20:20 -04:00
7034b651f8 set world seed, add new models
2026-03-23 22:04:13 -04:00
9c5aaca961 ollama models 2026-03-23 22:04:13 -04:00
c0d6a20780 Merge pull request 'make ollama and open-webui available on the local network for now' (#194) from feature/fwupd into main
Reviewed-on: #194
2026-03-23 21:02:58 -04:00
e8228616fb make ollama and open-webui available on the local network for now
2026-03-23 20:48:10 -04:00
9 changed files with 145 additions and 58 deletions

flake.lock (generated, 36 lines changed)
View File

@@ -76,11 +76,11 @@
 },
 "locked": {
 "dir": "pkgs/firefox-addons",
-"lastModified": 1774238582,
-"narHash": "sha256-Ki8cqI4709KnKyR5EbMMbtsc4k3vSP7KeCTAhBRZ640=",
+"lastModified": 1774497795,
+"narHash": "sha256-tzgxKaCEMcU6XT0fjV/vEqDCM9yij6wBgPPBKiK8Dfk=",
 "owner": "rycee",
 "repo": "nur-expressions",
-"rev": "1b4ad32c889411e7df7e9c88246e39c9407eae1f",
+"rev": "11af6f465a038233b8123022dcb7e293f3229f11",
 "type": "gitlab"
 },
 "original": {
@@ -242,11 +242,11 @@
 ]
 },
 "locked": {
-"lastModified": 1774210133,
-"narHash": "sha256-yeiWCY9aAUUJ3ebMVjs0UZXRnT5x90MCtpbpOWiXrvM=",
+"lastModified": 1774379316,
+"narHash": "sha256-0nGNxWDUH2Hzlj/R3Zf4FEK6fsFNB/dvewuboSRZqiI=",
 "owner": "nix-community",
 "repo": "home-manager",
-"rev": "c6fe2944ad9f2444b2d767c4a5edee7c166e8a95",
+"rev": "1eb0549a1ab3fe3f5acf86668249be15fa0e64f7",
 "type": "github"
 },
 "original": {
@@ -417,11 +417,11 @@
 },
 "nixos-hardware": {
 "locked": {
-"lastModified": 1774018263,
-"narHash": "sha256-HHYEwK1A22aSaxv2ibhMMkKvrDGKGlA/qObG4smrSqc=",
+"lastModified": 1774465523,
+"narHash": "sha256-4v7HPm63Q90nNn4fgkgKsjW1AH2Klw7XzPtHJr562nM=",
 "owner": "NixOS",
 "repo": "nixos-hardware",
-"rev": "2d4b4717b2534fad5c715968c1cece04a172b365",
+"rev": "de895be946ad1d8aafa0bb6dfc7e7e0e9e466a29",
 "type": "github"
 },
 "original": {
@@ -502,11 +502,11 @@
 },
 "nixpkgs_2": {
 "locked": {
-"lastModified": 1774106199,
-"narHash": "sha256-US5Tda2sKmjrg2lNHQL3jRQ6p96cgfWh3J1QBliQ8Ws=",
+"lastModified": 1774386573,
+"narHash": "sha256-4hAV26quOxdC6iyG7kYaZcM3VOskcPUrdCQd/nx8obc=",
 "owner": "nixos",
 "repo": "nixpkgs",
-"rev": "6c9a78c09ff4d6c21d0319114873508a6ec01655",
+"rev": "46db2e09e1d3f113a13c0d7b81e2f221c63b8ce9",
 "type": "github"
 },
 "original": {
@@ -596,11 +596,11 @@
 ]
 },
 "locked": {
-"lastModified": 1774235565,
-"narHash": "sha256-D8OOwvq3zDDCtIhMcNueb9tGSZaZUanKpWDleRgQ80U=",
+"lastModified": 1774494762,
+"narHash": "sha256-lt22GCJZ6qBQLgNZZl3S/RUjTLXTlEy0Fn0sqMttLxQ=",
 "owner": "oxalica",
 "repo": "rust-overlay",
-"rev": "dc00324a2438762582b49954373112b8eab29cab",
+"rev": "ce3b3a61ebf28670dfc8b97eb35ed9e24474a2cf",
 "type": "github"
 },
 "original": {
@@ -616,11 +616,11 @@
 ]
 },
 "locked": {
-"lastModified": 1774154798,
-"narHash": "sha256-zsTuloDSdKf+PrI1MsWx5z/cyGEJ8P3eERtAfdP8Bmg=",
+"lastModified": 1774303811,
+"narHash": "sha256-fhG4JAcLgjKwt+XHbjs8brpWnyKUfU4LikLm3s0Q/ic=",
 "owner": "Mic92",
 "repo": "sops-nix",
-"rev": "3e0d543e6ba6c0c48117a81614e90c6d8c425170",
+"rev": "614e256310e0a4f8a9ccae3fa80c11844fba7042",
 "type": "github"
 },
 "original": {
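
Each `locked` block above pins an input to an exact `rev` and `narHash`, so builds stay reproducible until the next `nix flake update` rewrites those fields. As a hedged sketch, these lock entries correspond to flake inputs declared roughly like this (input names are taken from the lock; the exact URLs and `follows` relations are assumptions):

```nix
{
  inputs = {
    # pinned in flake.lock by rev + narHash; `nix flake update` refreshes them
    nixpkgs.url = "github:nixos/nixpkgs";
    nixos-hardware.url = "github:NixOS/nixos-hardware";
    home-manager = {
      url = "github:nix-community/home-manager";
      inputs.nixpkgs.follows = "nixpkgs"; # assumed, keeps one nixpkgs eval
    };
    sops-nix = {
      url = "github:Mic92/sops-nix";
      inputs.nixpkgs.follows = "nixpkgs";
    };
    rust-overlay = {
      url = "github:oxalica/rust-overlay";
      inputs.nixpkgs.follows = "nixpkgs";
    };
  };
}
```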

View File

@@ -46,10 +46,10 @@
 kubernetes
 ];
-# Enable containerd for Kubernetes
-virtualisation.containerd.enable = true;
+## Enable containerd for Kubernetes
+#virtualisation.containerd.enable = true;
-# Enable kubelet
+## Enable kubelet
 #services.kubelet = {
 # enable = true;
 # extraFlags = {

View File

@@ -1,6 +1,7 @@
 {
 lib,
 pkgs,
+config,
 ...
 }:
 {
@@ -17,6 +18,7 @@
 ./stylix.nix
 ./wifi.nix
 ./zerotier.nix
+../palatine-hill/ollama.nix
 ];
 time.timeZone = "America/New_York";
@@ -39,6 +41,19 @@
 sops.age.sshKeyPaths = [ "/etc/ssh/ssh_host_ed25519_key" ];
 services = {
+ollama = {
+package = lib.mkForce pkgs.ollama-rocm;
+models = lib.mkForce "${config.services.ollama.home}/models";
+loadModels = lib.mkForce [
+"deepseek-r1:1.5b"
+"lennyerik/zeta"
+"nomic-embed-text:latest"
+"glm-4.7-flash"
+"magistral"
+"devstral-small-2"
+"starcoder2:7b"
+];
+};
 flatpak.enable = true;
 calibre-web = {
 # temp disable this
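
The hunk above layers host-specific values on top of the shared `../palatine-hill/ollama.nix` import; `lib.mkForce` raises the definition's priority so it wins over whatever the imported module sets. A minimal sketch of that override pattern (options from the hunk, shown in isolation):

```nix
{ lib, pkgs, config, ... }:
{
  imports = [ ../palatine-hill/ollama.nix ];

  # mkForce beats the normal-priority values defined in the shared module
  services.ollama = {
    package = lib.mkForce pkgs.ollama-rocm;
    models = lib.mkForce "${config.services.ollama.home}/models";
  };
}
```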

View File

@@ -27,6 +27,7 @@
 fd
 file
 firefox
+# gestures replacement
 git
 glances
@@ -34,12 +35,8 @@
 grim
 htop
 hwloc
-ipmiview
 iperf3
 # ipscan
-javaPackages.compiler.temurin-bin.jdk-25
-javaPackages.compiler.temurin-bin.jdk-21
-javaPackages.compiler.temurin-bin.jdk-17
 jp2a
 jq
 kdePackages.kdenlive

View File

@@ -17,7 +17,6 @@
 ./minio.nix
 ./networking.nix
 ./nextcloud.nix
-./ollama.nix
 #./plex
 ./postgresql.nix
 ./samba.nix
@@ -26,10 +25,8 @@
 programs.git.lfs.enable = false;
-nixpkgs.config = {
-packageOverrides = pkgs: {
-vaapiIntel = pkgs.vaapiIntel.override { enableHybridCodec = true; };
-};
+nixpkgs.config.packageOverrides = pkgs: {
+vaapiIntel = pkgs.vaapiIntel.override { enableHybridCodec = true; };
+};
 boot = {
@@ -100,13 +97,6 @@
 smartd.enable = true;
 calibre-server.enable = false;
-# Kubernetes example configuration
-# To enable Kubernetes, uncomment the following:
-# kubernetes = {
-# enable = true;
-# clusterName = "palatine-hill-cluster";
-# controlPlaneEndpoint = "localhost:6443";
-# };
 };
 nix.gc.options = "--delete-older-than 150d";
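
The `packageOverrides` change above flattens a one-attribute nested set into a dotted path; in Nix, `a.b = x;` is sugar for `a = { b = x; };`, so both spellings define the same option and the change is purely cosmetic. A minimal illustration:

```nix
{ ... }:
{
  # Equivalent to:
  # nixpkgs.config = {
  #   packageOverrides = pkgs: { ... };
  # };
  # The dotted form reads better when the set holds a single attribute.
  nixpkgs.config.packageOverrides = pkgs: {
    vaapiIntel = pkgs.vaapiIntel.override { enableHybridCodec = true; };
  };
}
```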

View File

@@ -93,10 +93,11 @@ in
 CF_FILENAME_MATCHER = "1.11.2";
 USE_AIKAR_FLAGS = "false";
 USE_MEOWICE_FLAGS = "true";
-DIFFICULTY = "hard";
+DIFFICULTY = "peaceful";
 ENABLE_COMMAND_BLOCK = "true";
 INIT_MEMORY = "4G";
 MAX_MEMORY = "16G";
+SEED = "-7146406535839057559";
 };
 extraOptions = defaultOptions;
 log-driver = "local";

View File

@@ -12,39 +12,45 @@ in
package = pkgs.ollama;
syncModels = true;
loadModels = [
"gemma3"
"deepseek-r1:latest"
"deepseek-r1:1.5b"
"qwen3"
#"qwen3-coder-next"
"qwen3-coder"
"deepseek-r1:32b"
"deepseek-r1:70b"
#"qwen3"
#"qwen3.5:latest"
"qwen3-coder-next"
"lennyerik/zeta"
"llama3.1:8b"
"qwen2.5-coder:1.5b-base"
"nomic-embed-text:latest"
"lfm2:24b"
"glm-4.7-flash"
"nemotron-cascade-2:30b"
"magistral"
"devstral-small-2"
"starcoder2:15b"
];
models = vars.primary_ollama;
environmentVariables = {
FLASH_ATTENTION = "1";
OLLAMA_KV_CACHE_TYPE = "q8_0";
OLLAMA_KV_CACHE_TYPE = "q4_0";
# Ollama memory configuration
OLLAMA_MAX_LOADED_MODELS = "2";
OLLAMA_MAX_QUEUE = "4";
OLLAMA_NUM_PARALLEL = "2";
OLLAMA_MAX_LOADED_MODELS = "3";
OLLAMA_MAX_QUEUE = "512";
OLLAMA_NUM_PARALLEL = "1";
# ROCm memory optimization
#HIP_VISIBLE_DEVICES = "0";
#ROCR_VISIBLE_DEVICES = "0";
# context length for agents
OLLAMA_CONTEXT_LENGTH = "64000";
OLLAMA_CONTEXT_LENGTH = "128000";
};
openFirewall = true;
host = "0.0.0.0"; # don't want to make this available via load-balancer yet, so making it available on the local network
};
open-webui = {
enable = true;
port = 21212;
openFirewall = true;
host = "0.0.0.0"; # don't want to make this available via load-balancer yet, so making it available on the local network
};
};
users.users.ollama = {

View File

@@ -31,7 +31,6 @@
grim
htop
hwloc
ipmiview
iperf3
# ipscan
jp2a

View File

@@ -63,8 +63,81 @@
 "latex"
 "terraform"
 "log"
+"context7-mcp-server"
+"github-mcp-server"
 ];
 userSettings = {
+context_servers = {
+nixos = {
+command = "nix";
+args = [
+"run"
+"github:utensils/mcp-nixos"
+"--"
+];
+};
+};
+language_models = {
+ollama = {
+api_url = "http://192.168.76.2:11434";
+context_window = 128000;
+# global keep alive doesnt work
+#keep_alive = "15m";
+available_models = [
+{
+name = "deepseek-r1:1.5b";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+{
+name = "deepseek-r1:32b";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+{
+name = "deepseek-r1:70b";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+{
+name = "qwen3-coder-next";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+{
+name = "lennyerik/zeta";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+{
+name = "nomic-embed-text:latest";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+{
+name = "lfm2:24b";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+{
+name = "glm-4.7-flash";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+{
+name = "nemotron-cascade-2:30b";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+{
+name = "magistral";
+max_tokens = 128000;
+keep_alive = "15m";
+}
+];
+};
+};
 colorize_brackets = true;
 hard_tabs = false;
 vim_mode = true;
@@ -77,7 +150,7 @@
 agent = {
 default_model = {
 provider = "ollama";
-model = "qwen2.5-coder:latest";
+model = "glm-4.7-flash";
 };
 favorite_models = [ ];
 model_parameters = [ ];
@@ -89,13 +162,16 @@
 journal = {
 hour_format = "hour24";
 };
-edit_preditions = {
-provider = "open_ai_compatible_api";
-open_ai_compatible_api = {
-api_url = "http://localhost:11434/v1/completions";
-model = "zeta:latest";
-prompt_format = "infer";
+edit_predictions = {
+provider = "ollama";
+ollama = {
+#api_url = "http://192.168.76.2:11434/v1/completions";
+api_url = "http://192.168.76.2:11434";
+context_window = 128000;
+model = "lennyerik/zeta";
+prompt_format = "qwen";
+max_requests = 64;
+max_output_tokens = 256;
 };
 };
 texlab = {
texlab = {
@@ -144,6 +220,8 @@
 # markdown
 nodePackages.markdownlint-cli
 # insert essential rust dependencies
+# doom emacs dependencies
+yaml-language-server
 nodePackages.typescript-language-server
@@ -184,5 +262,6 @@
 # arch zed deps
 nixd
+uv
 ];
 }
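
The Zed hunks above work around the note that a global `keep_alive` is not honored by attaching a per-model `keep_alive` to every entry in `available_models`. Assuming the home-manager `programs.zed-editor` module (which renders `userSettings` into Zed's `settings.json`), the shape of that workaround in isolation looks roughly like this:

```nix
{
  programs.zed-editor = {
    enable = true;
    userSettings.language_models.ollama = {
      api_url = "http://192.168.76.2:11434";
      # keep_alive must be set per model; a provider-wide value is ignored
      available_models = [
        {
          name = "glm-4.7-flash";
          max_tokens = 128000;
          keep_alive = "15m";
        }
      ];
    };
  };
}
```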