• 4 Posts
  • 71 Comments
Joined 1 year ago
Cake day: June 20th, 2023

  • This is fine. I don’t mind a diversity of opinion here. I agree that Proton is a stop-gap solution, that most older games are going to need it, and that newer AAA games aren’t suddenly going to support Linux.

    However, I do think we should continue to encourage developers to create native builds when they can. Indie devs tend to do this, and it’s a pretty great experience. Not only that, it often enables playing on unusual devices such as SBCs. For example, UFO 50 was made in GameMaker, which offers native Linux builds, and it’s already on PortMaster. You basically can’t do that with Proton.

    My problem is calling people who want Linux native games misguided or wrong. I really don’t think that’s helpful.


  • I wish he wouldn’t repeat the idea that Proton is acceptable to game devs and that Linux users shouldn’t demand native games. I’m much closer to Nick’s (from The Linux Experiment) view: these games work only as long as a company like Valve pays for Proton. The day Valve stops is the day those Proton games start to rot. For archival, for our own history, and for actual games on Linux, we should want Linux-native games.

    The thing is, the “no tux no bucks” crowd doesn’t demand that other people say the same. The Proton crowd, though, is actively telling the “no tux no bucks” people to shut up, and that’s not very nice. We need a multitude of views to succeed as a community in the long term.

  • I think it’s worth thinking about this in a technical sense, not just a political or capitalist one: yes, car companies want self-driving cars, but self-driving cars are immensely dangerous, and there’s no evidence that they will make roads safer. As such, legislation should push very hard to stop self-driving cars.

    Also, the same technology used for self-driving is used for AEB. This actually makes self-driving more likely: since the car companies have to pay for all that equipment anyway, they may as well try to shoehorn in self-driving. On top of this, I have no confidence that the odds of an error in the system (e.g. a dirty sensor, or software getting confused) are lower than the odds of the system correctly braking when it needs to.
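    That last point is essentially a base-rate problem, and it can be sketched with Bayes’ rule. A minimal illustration in Python, where every number is an invented assumption for the sketch (real AEB trigger rates aren’t published in this form):

```python
# Hypothetical base-rate sketch: how often is a hard-brake event a real emergency?
# All three numbers below are assumptions for illustration, not measured AEB data.

p_emergency = 1e-6      # assumed chance a given second of driving needs hard braking
p_trigger_real = 0.99   # assumed sensitivity: P(system brakes | real emergency)
p_trigger_false = 1e-5  # assumed false-trigger rate: P(system brakes | nothing there)

# Bayes' rule: P(real emergency | system braked)
p_brake = p_trigger_real * p_emergency + p_trigger_false * (1 - p_emergency)
p_real_given_brake = p_trigger_real * p_emergency / p_brake

print(f"Share of hard-brake events that are real: {p_real_given_brake:.1%}")
# Under these assumptions, roughly 9% -- most triggers would be false alarms,
# simply because real emergencies are so rare per unit of driving time.
```

    Plug in different assumptions to see how sensitive the result is; the false-alarm share is dominated by the ratio of the false-trigger rate to the emergency base rate.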

    This means a driver can end up in a situation where:

    • they are in a car, on a road, with nothing of interest in front of them;
    • the software determines that a crash is imminent;
    • the car brakes hard (even at 90 mph), perhaps losing traction depending on road conditions;
    • they may be hit from behind, or may hit an object;
    • and the driver is liable even though they never actually pressed the brakes.

    This is unacceptable on its face. Yes, cars are dangerous, and yes, we need to make them safer, but we should do that with better policies: lower speeds, safer road design, and a transition to smaller, lighter cars, not this AI automation bullshit.