The name is OpenLara (https://github.com/XProger/OpenLara) and you can try out the WebGL build directly in your web browser at http://xproger.info/projects/OpenLara/ . The web version works amazingly well on my Pixel 7a with touch controls (you have to tap the “go fullscreen” button) using Firefox as the browser.
Yet, it is really not a good idea to run stuff in a web browser. Web browsers are a notoriously insecure, slow platform, with controls (“Back”, “Reload”, …) that are not designed for running applications.
edit: I did not expect that the “modern web” crowd would now come here to berate (and downvote) me for the sacrilege of not unconditionally considering web browsers to be the very best piece of software for every purpose. My fault, sorry. I’m out of here, this is pointless.
Tell that to Google with Google Docs, Microsoft with Office 365, etc. Web applications are becoming a thing in a big way.
When exactly has “large companies do that” become a good reason instead of a warning?
I agree. Every company I’ve worked for recently has been migrating to web-based applications, which has DEFINITELY been fun.
I don’t think that “fun” should be the only relevant aspect, especially not with network-facing applications managing personal data.
I was being sarcastic. Should’ve included the /s
Let me get this straight: you think running something in a browser, with its sandboxed design, is somehow less secure than downloading executables off of GitHub?
Yes, because browser sandboxes will NEVER be as secure as kernel sandboxes.
(Run the browser in the kernel sandbox)
Touché.
I can check and validate the code I download from GitHub before I compile and run it, and I can be sure that the binary I compiled will always stay the same. None of that is true with web apps: I can’t check the code before running it (maybe I could with JavaScript, but not with WebAssembly), and since the code gets delivered on the fly, it could be changed at any time, either on the server or by a third party in transit (TLS is not an impenetrable barrier, not with the huge list of trusted certificate authorities that ships in every browser).
That alone puts browser-based applications in a much higher risk category.
And when it comes to binaries: I can analyse those before running them if I want to; again, something I can’t do with dynamically delivered code in the browser.
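To make that “check before you run” step concrete: verifying a downloaded release against a published checksum is the minimal version of this. A sketch below, using a local file to stand in for a release tarball (the file name is made up for the demo; real projects publish their own checksum files alongside releases):

```shell
# Stand-in for a downloaded release tarball (hypothetical file name).
printf 'pretend this is a release tarball\n' > OpenLara-src.tar.gz

# Publisher side: ship a checksum file alongside the artifact.
sha256sum OpenLara-src.tar.gz > OpenLara-src.tar.gz.sha256

# Consumer side: verify before building or running anything.
# Prints "OpenLara-src.tar.gz: OK" and exits 0 only if the file is untampered.
sha256sum -c OpenLara-src.tar.gz.sha256
```

A web app offers no equivalent step: there is no stable artifact to hash, because the next page load may serve different code.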
What’s the biggest code base you have ever reviewed? What’s the most recent TLS vulnerability you have encountered, as opposed to the last vulnerability in other parts of your OS? Code being swapped by the server, maybe, but are you saying you do a code review every time you update a package or dependency of some other project? This is only less secure in some inconceivably convoluted chain of events that no practical person could enact. No sane person does what you’re saying. Everyone has to trust someone else with code blindly at some point.
Yeah, man-in-the-middle attacks are completely uncommon and have never happened. You don’t need vulnerabilities in TLS itself, but there are plenty of those anyway; check the CVE list for 2023 alone: https://www.openssl.org/news/vulnerabilities.html#y2023
You only need access to a valid certificate authority (no problem for a state actor, for example) to insert yourself into the chain of trust. Yes, there are mechanisms against that, but unfortunately they are not widely deployed yet.
And I never said that I do code audits, only that I have the option to.
I find the web version cool from a technological perspective. While I agree with you on the slowness of JavaScript apps vs. native apps in general, WebAssembly is a completely different beast: you can get very good performance from it while keeping all the portability advantages. It’s basically what Java tried to become with applets back then. It’s also cool because you can have things like a whole Jupyter Notebook running locally in your browser without installing anything, which can be useful when you have to teach Python and don’t want to deal with students not installing their stuff before class begins: https://github.com/jupyterlite/jupyterlite .
FYI, the OpenLara GitHub repo I linked to also has native builds. In particular, it hilariously has a Gameboy Advance build: https://www.youtube.com/watch?v=_GVSLcqGP7g
Nowadays web browsers are so much more than that. With tools like WebAssembly, WebUSB, Web Bluetooth, the Gamepad API, WebGPU and all that, we are far away from the slow platforms with limited controls of the ol’ days.
The list of modern APIs is almost endless: https://developer.mozilla.org/en-US/docs/Web/API
Exposing your hardware over JavaScript sounds dangerous to me, to be honest. But well, I’m sure that nothing bad could ever happen.
Web browsers are still much slower than your kernel.
I never said that it would be safe; I purposely left that out. I am not a fan of web apps and games running in browsers myself, at all.
But it can be a valid option for everyone not as paranoid as me.
If there’s one thing that everyone could have learned from the Snowden papers, that one thing is that you aren’t “paranoid”.
That’s true!
That was true in the old days of Flash and Java applets. There’s not much that can be done with the WebGL and canvas APIs unless there’s a vulnerability in a specific browser version on specific Intel or AMD hardware, which in this case is unlikely given that the code is open source.
Compared to what?
Compared to native platforms.
Okay, I have to admit that that’s leaving me a bit nonplussed. Assume for a moment that I am concerned about the security implications of running an open-source Tomb Raider engine implementation. How exactly are you proposing running this in a more-secure fashion?
If I run an executable on my platform – say, an ELF binary on Linux – then normally that binary is going to have access to do whatever I can do. That’s a superset of what code running inside a Web browser that I’m running can do.
Are you advocating for some form of isolation? If so, what?
EDIT: And I’ve got another question for you. Let’s say that you’re worried about security of browser APIs. How do you avoid this? Because if your browser is vulnerable to some exploit in its WebGL implementation, not clicking on a link explicitly labeled as going to a website that uses 3D – which is what you appear to be urging people to do – isn’t going to avoid it. Any site you browse to – including those not labeled as such – could well expose you to that vulnerability.
EDIT2: In another comment, you say that you want to trust the “kernel” instead of the browser. Okay, fine. There is a whole class of isolation mechanisms there. Which mechanism are you proposing? Remember that you need to give access to your 3D hardware to whatever software package is involved here, and the Linux kernel, at least, doesn’t have a mechanism for creating virtual, restricted “child” graphics devices. The closest you can get at a kernel level on Linux, as far as I can think of, would be pass-through from a VM to a dedicated graphics adapter, which probably isn’t going to be an option for most people, and I have doubts about it being a carefully hardened pathway compared to browser APIs.
Kernel sandboxing. I mean, breaking out of browser “sandboxes” is a game these days.
Which is why using the web without JavaScript is a security measure I strongly recommend. Sure, many sites will be “less interactive” then, but I’m afraid it is the only solution. For the (usually rather small) number of websites which you absolutely need to use with JavaScript enabled (do you, really?), a separate browser inside a container (or VM) would be a good option. I admit that this is not the most comfortable setup, but I really prefer to be safe than sorry. YMMV, but you asked.
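The “separate browser inside a container” setup can be sketched roughly like this. This is one possible approach, not the only one: it assumes firejail is installed and that you use Firefox; any browser with a firejail profile (or any container runtime you trust) works the same way.

```shell
# Confine the browser with kernel-level sandboxing (namespaces + seccomp).
# --private gives it a throwaway home directory, so nothing it writes
# survives the session and it cannot read your real home.
firejail --private firefox https://example.com

# Inspect what the sandbox actually restricts for a running session:
firejail --list
```

The point of the thread applies here too: the trust is placed in a small, auditable kernel mechanism rather than in the browser’s own sandbox.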
That’s a class of different mechanisms. I updated my comment above. I’ll repeat the text there:
Virtually every website out there today uses Javascript. Lemmy uses Javascript. What makes this particular website a risk?
Yeah, I do. Fifteen years ago, I used NoScript, and some things broke, but it was usable; there were enough people running non-JS-capable browsers that websites had a reasonable chance of functioning. The Web generally does not function without Javascript today.
Most of those work without it.
Lemmy is one of several ActivityPub-capable applications. You do not need to use Lemmy inside a web browser in order to participate here. In fact, you don’t even need to use a web browser.
I disagree. Some websites (with lazy developers) work less well without JavaScript. You’ll gain fewer annoyances (no JS = no pop-ups and no sophisticated anti-adblock techniques), more speed, lower energy consumption, fewer potential security risks. You’ll lose… not much, really: “web applications” (usually worse, slower and less reliable than installed software) and a couple of websites which are very focused on providing effects over content. Sounds like a fair deal to me, but again, YMMV.
Yes, there will never be absolute security. If it runs on a computer, it most likely has security flaws.
Is this true for PWAs for mobile too? I’m a security noob, but I’ve noticed the (few) PWAs I use on iOS seem to be using and sending a lot less telemetry stuff compared to their app counterparts. They seem faster too.
“PWAs” are still less efficient than native apps. There are many disadvantages, and one advantage (“they’re easy to make”).