I love installing new repos and experimenting with new code. However, my enthusiasm is sometimes dampened by the one- to two-week installation process.
For instance, on a recent repo, I had to make my Windows PC a dual-boot system because the software required “bare metal” Linux. I purchased a second internal hard drive, installed the exact required version of Linux, then the exact required versions of TensorFlow-GPU, CUDA, Python, and Unity.
If I install another repo of similar complexity, I will likely have to install another version of Linux, TensorFlow, and Python, as well as the package dependencies. Having virtual environments to segregate the dependencies isn’t enough; I’d like to see us get to the point where we can have a “one-click” installation of even complicated programs with many dependencies, so even newbies can install intricate projects.
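To be concrete about why environment tools only go partway: a per-project spec can pin everything above the OS, but not the OS or driver layer itself. A minimal sketch of a conda environment.yml (the version numbers here are illustrative, not taken from any particular repo):

```yaml
# Illustrative conda environment spec -- version numbers are made up,
# not from any particular repo.
name: repo-env
channels:
  - conda-forge
dependencies:
  - python=3.6          # the exact interpreter the repo demands
  - cudatoolkit=10.0    # CUDA userland libraries, but not the GPU driver
  - tensorflow-gpu=1.15
# The OS, kernel, and NVIDIA driver still have to match by hand,
# which is the part no virtual environment can pin for you.
```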
You say that you had to make your Windows PC dual-boot because “the
software” required “bare metal Linux”.
What software are you referring to? Surely not Python, it runs happily
on Windows. If not Python, but some other software, what do you expect
us to do about third-party software we don’t control?
What do you mean by “bare metal”? To me, that term applies to running
software without an operating system. You might mean something else,
since you were running Linux.
TensorFlow runs on Windows; CUDA runs on Windows; Python runs on
Windows; the Unity game engine runs on Windows. So it isn’t clear to me
why you needed Linux or why it specifically had to be a dual-boot system
rather than running in a VM, and it certainly isn’t clear to me why it
is Python’s responsibility to install the full stack of software
including the operating system.
I’m not unsympathetic to your problem; it is just that you seem to be
venting your frustrations rather than giving sufficient detail to allow
constructive discussion to take place.