#artificial_intelligence #cloud_computing #technology

provides gpu renting

there are two models:

they buy a server, stuff it full of gpus, and slice out docker containers that run a certain image, install sshd in it, and you ssh in and run your compute. pretty interesting, coming from a world where cloud providers basically run VMs. that might be good to lessen overhead (and setup overhead, most likely, as you'd be able to ln -s /dev/DRIxxx /path/to/docker/ct/dev/DRIxxx, right...?)
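the symlink idea is close but not quite how docker does it: a bare device node in the container rootfs usually isn't enough, because the container's device cgroup also has to allow access, which is exactly what docker's `--device` flag handles. a hedged sketch of how a provider might slice a gpu out per container (the flags are real docker CLI; the image names and device paths are just examples, and the provider's actual setup is an assumption):

```shell
# pass a single DRM render node through to the container;
# --device both bind-mounts the node and whitelists it in the device cgroup
# (renderD128 is an example path -- check /dev/dri on the host):
docker run --rm -it \
  --device=/dev/dri/renderD128 \
  ubuntu:22.04 ls -l /dev/dri

# for nvidia gpus the common route is the nvidia container toolkit,
# which wires up the driver libraries and device nodes via --gpus:
docker run --rm -it --gpus all nvidia/cuda:12.4.0-base-ubuntu22.04 nvidia-smi
```

either way the intuition in the note holds: no VM, so near-zero virtualization overhead, and "provisioning" a customer is basically `docker run` plus an sshd.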

there are other competitors in the space:

pretty interesting to see all of these appear, though really it's me only now learning they even exist. interest skyrocketed thanks to the ai boom kicked off by DALL-E (with Stable Diffusion stealing the spotlight, which is DESERVED. fuck closed models)