Can confirm I would certainly also like this feature - I know that the Emby and Plex dockers already include the required drivers / libraries, as they use NVENC to accelerate transcoding, so the libraries are available!

One example of a docker image being built with the CUDA libraries is here: It should therefore be relatively straightforward to include them, but it depends on what your docker base OS is? I've not tried with Alpine!

Just be aware it might bump the image size considerably! BUT you can strip out some of the bulk by not installing the extras that aren't needed.

Edit: Just thought - when / if you do this you might need to add some documentation noting that the Docker host will also require the same libraries / drivers installed. For certain OSs this may already exist as downloadable drivers plus the nvidia docker container runtime (Ubuntu etc.) or as a plugin (Unraid). I am less familiar with how they can pull it from the Nvidia Dockers rather than installing outright, but documentation for this is here?.

It's all good!!! I did the same thing and used jlesage's Dockerfile, so feel free. The relevant kernel modules from lsmod on the host:

    i2c_core               40960  4 nvidia,drm_kms_helper,drm,ipmi_ssif
    nvidia              16510976  2 nvidia_uvm,nvidia_modeset
    cryptd                 20480  3 ghash_clmulni_intel,aesni_intel,crypto_simd
    ip_tables              24576  5 iptable_mangle,iptable_nat,iptable_filter
    nf_nat_ipv4            16384  2 ipt_MASQUERADE,iptable_nat
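As a minimal sketch of the host-side check mentioned above, the snippet below greps lsmod-style output for the NVIDIA kernel modules. It parses a captured sample (based on the output quoted in the thread) so it runs anywhere; on a real host you would pipe lsmod directly. The module list and the check itself are illustrative assumptions, not something from the original posts.

```shell
# Hedged sketch: confirm the Docker host has the NVIDIA kernel modules
# loaded before expecting NVENC to work inside a container.
# A captured sample of `lsmod` output keeps this self-contained;
# on a real host, replace $lsmod_output with the output of `lsmod`.
lsmod_output='Module                  Size  Used by
i2c_core               40960  4 nvidia,drm_kms_helper,drm,ipmi_ssif
nvidia              16510976  2 nvidia_uvm,nvidia_modeset
cryptd                 20480  3 ghash_clmulni_intel,aesni_intel,crypto_simd'

status=""
for m in nvidia nvidia_uvm nvidia_modeset; do
    # grep -w matches whole words (underscore counts as a word char),
    # so "nvidia" does not falsely match inside "nvidia_uvm".
    # Note: this matches the module anywhere on a line, including the
    # "Used by" column, which is sufficient for a quick sanity check.
    if printf '%s\n' "$lsmod_output" | grep -qw "$m"; then
        status="$status$m=loaded "
    else
        status="$status$m=missing "
    fi
done
echo "$status"
```

If any module shows as missing, the host driver needs installing first - and, per the thread, on Ubuntu etc. you would also want the nvidia docker container runtime, or the equivalent plugin on Unraid.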