

Should just have it handle voting as well. They could call it Automatic Democracy.
None of that pesky informed voting, you can just instruct an AI company on what your stance is, and it’ll vote in your stead.


According to the article linked within it, it’s not that the operating system itself is more demanding; rather, the desktop environment and browsers/websites are more demanding now.
It feels like Canonical needs to do what games do: publish minimum specs for Ubuntu to run at all and recommended specs for it to run well. Canonical basically bumped up the latter, but it’s being taken as the former.


It’s odd, since they used to have a rather nice HTML web interface specifically for low-performance devices, but it has since gone away.


This doesn’t seem so bad, though. An extra 2 GB over about ten years is a pretty reasonable increase.
It’s not like they doubled it.


Calling a Visual Studio/programming project that takes 50 GB of memory merely a “bigger project” seems like rather an understatement, unless you’re working on machine learning, simulations, or something of that nature.
Human drivers, if they could get LIDAR with their cars, would probably also use it.
Why not aim for better than what humans can do?