Hi Beehaw people! New here and hope some of you will take interest in this toolkit and accompanying writeup. :3


I set up a framework to fully man-in-the-middle my own browsers’ networking and see what they’re up to, beyond just looking at their DNS queries and encrypted TCP packets. We force the browser to trust the mitmproxy CA cert so we can peek inside the cleartext traffic, and the whole thing is conveniently reproducible and extensible.

It has containers for official Firefox, its Debian build, and some other Firefox derivatives that market a focus on privacy or security. I might add a few more of those, or do the Chromium family later - if you read the thing and want more, please let us know what you want to see under the lens in a future update!

Tests were run against a basic protocol for each of them and results are aggregated at the end of the post.


Apart from testing browsers themselves, it can be useful for putting extensions under the lens. Making a modern browser properly accept a proxy and trust the mitmproxy cert is a lot more obscure and fiddly than it might seem, so hopefully this can empower other people to peek inside what’s actually going on in their own systems without spending hours or days figuring out what makes it tick.
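For a sense of what that fiddliness involves, here’s a minimal sketch of the Firefox prefs in play, assuming mitmproxy listens on 127.0.0.1:8080 (the pref names are real Firefox prefs; the host/port values are illustrative, and the toolkit wires all this up for you):

```js
// user.js sketch - route all HTTP/HTTPS through the local proxy
user_pref("network.proxy.type", 1);              // 1 = manual proxy config
user_pref("network.proxy.http", "127.0.0.1");
user_pref("network.proxy.http_port", 8080);
user_pref("network.proxy.ssl", "127.0.0.1");     // HTTPS goes through too
user_pref("network.proxy.ssl_port", 8080);
// Let Firefox trust CAs from the OS store, where the mitmproxy CA
// can be installed - otherwise it rejects the proxy's forged certs.
user_pref("security.enterprise_roots.enabled", true);
```

On top of that, each derivative ships its own defaults that can override or ignore parts of this, which is exactly why a reproducible container setup helps.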

Rewritten cross-post. First Thread @ https://discuss.tchncs.de/post/53845514

  • DetachablePianist@lemmy.ml · 1 day ago

    Thanks for sharing! A human-friendly summary of the gist of those results would be super helpful. This looks interesting, and you may have rekindled my interest in LibreWolf over BetterFox-hardened Firefox again, but I’m not entirely clear on what your results mean in human terms.

    • ken@discuss.tchncs.de (OP) · edited · 24 hours ago

      Thank you for the kind words!

      Ah, then the hope is that this curiosity will push you to dig into it yourself (for example using the provided tool, or taking inspiration from it) so that it starts making sense! I know it’s an unconventional format to refrain from laying out my own opinions and analysis, but that’s my thing today. There’s so much “everyone knows” and so many vapid third-hand takes flying around these days that I think we would do well to actually verify (and pick up related knowledge in the process) rather than take forum comments and blog posts as gospel.


      OK, all right, I can try. I guess I can point at one thing in the Mozilla telemetry at the very end: doesn’t that look very fine-grained if you look at the URLs (addresses) listed?

      We can tell that many of the actions I took were communicated to the mothership for analysis and product improvement. Is this data really anonymized (or anonymizable)? Is it a reasonable amount for a user that has not opted in? My professional and personal opinion is: It is not.

      But! That’s just one isolated example, and an extremely limited view. What about Zen? Chrome, Edge and Safari weren’t included here at all. And it doesn’t look at all at what happens for a user who probably cares about this: when you go to settings and disable all the telemetry. See, I just said that one thing about Mozilla telemetry, and now I’m going to have to run new tests and write reports about them for days just to set that record straight!

      Maybe I’m odd, but I think it’s many (100?) times easier and quicker to gain an understanding of the kind of stuff we’re looking at here by getting hands-on than by having it explained verbally. And I’m concerned about the limited attention span so many people are afflicted with these days - look at how long this comment is already. So, we’re done with me telling you how it is; let’s wrap this up and get to the juicy stuff.


      There’s an expandable section, Basic test environment usage, under Testing procedure, but I realize now that it might be easy to miss…

      Anyway, to start it: install podman and docker-compose (v2), then run `MITM_BROWSER=firefox-esr podman compose up --build`. That should be it.

      Then the browser pops up (hopefully), you do your thing, and after you Ctrl+C in the console it quits and the proxy dumps the recorded .har file, which contains all the HTTP and websocket traffic that went through the proxy, in cleartext, in JSON format. There are tools online that can help visualize HAR files, I think, but nothing I can recommend off the bat. Simply `cat`ing it to the terminal or opening it in a text editor can be educational. Also try playing around with variations of the jq snippets to see if you can come up with questions of your own to answer. Or, if anything in my numbers makes you scratch your head or say “wait a minute”, dig there.
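      If jq isn’t your thing, a few lines of Python give you the same kind of first look. This is just a sketch: `capture.har` is an assumed filename (use whatever your run produced), and it relies only on the standard HAR layout of `log.entries[].request.url`:

```python
import json
import os
from collections import Counter
from urllib.parse import urlparse

def host_counts(har_path):
    """Count how many requests went to each host in a HAR file."""
    with open(har_path) as f:
        har = json.load(f)
    # The HAR format stores each request under log.entries[].request.url
    hosts = Counter(urlparse(entry["request"]["url"]).netloc
                    for entry in har["log"]["entries"])
    return hosts.most_common()

if __name__ == "__main__" and os.path.exists("capture.har"):
    for host, n in host_counts("capture.har"):
        print(f"{n:5d}  {host}")
```

      A sorted per-host request count like this is usually the quickest way to spot a telemetry endpoint you didn’t expect, and then you can grep the raw HAR for that host to see exactly what was sent.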

      In case you want to take a look at what the thing does before running it (trust me, bro), these are the files involved when you run that compose up command:

      Available browser images