RunPod
#artificial_intelligence #cloud_computing #technology
provides GPU renting
there are two models:
- secure cloud: first-party RunPod hardware
- community cloud: decentralized compute from third parties, which they vet and offer at (supposedly) cheaper rates
they buy a server, stuff it full of GPUs, slice it into docker containers that run a certain image, install sshd in them, and you SSH in and run your compute. pretty interesting, coming from a world where cloud providers basically run VMs. that might be good to lessen overhead (and setup overhead, most likely, as you'd be able to `ln -s /dev/DRIxxx /path/to/docker/ct/dev/DRIxxx`, right...?)
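a rough sketch of what that slicing might look like in practice. docker has first-class flags for this, so you wouldn't even need the symlink trick: either `--gpus` (which requires the NVIDIA Container Toolkit on the host) or raw `--device` passthrough of the DRM nodes. image names and device paths below are illustrative assumptions, not anything RunPod has confirmed:

```shell
# option 1: NVIDIA runtime route (assumes nvidia-container-toolkit on host)
# hands the container all GPUs and runs nvidia-smi as a smoke test
docker run --rm --gpus all \
  nvidia/cuda:12.2.0-runtime-ubuntu22.04 \
  nvidia-smi

# option 2: bare device-node route, closer to the symlink idea above --
# expose specific DRM devices directly (exact paths depend on the host)
docker run --rm \
  --device /dev/dri/card0 \
  --device /dev/dri/renderD128 \
  ubuntu:22.04 ls -l /dev/dri

# the sshd part: publish port 22 so the customer can get in
docker run -d -p 2222:22 --gpus '"device=0"' some-image-with-sshd
```

the `--gpus '"device=0"'` form is how you'd slice one physical GPU per customer container, which fits the "stuff it full of GPUs and slice out containers" model.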
there are other competitors in the space:
- https://vast.ai
- https://lambdalabs.com/service/gpu-cloud
- https://cloud.jarvislabs.ai/
- https://coreweave.com
- https://www.paperspace.com/
pretty interesting to see all of these appear, though it's more me learning they even exist. interest skyrocketed thanks to the AI boom from DALL-E (with Stable Diffusion stealing the spotlight, which is DESERVED. fuck closed models)