hello :))

I have problems with the WiFi adapter on my new PC, and to troubleshoot it I need some utilities that are not already on the computer.

is it possible to just copy the binaries from a computer with an internet connection onto a USB drive and move them over that way?

And in that case, how do I make sure to also copy all the dependencies?

or is there a smarter way to do it altogether? 😅

I hope this is the right community for this question :)) I couldn’t find any community specifically for Linux tech support.

  • ObsidianBreaks@lemmy.ml · 12 points · 1 year ago

    Why not use a live ISO of something and boot it from a USB? If you need a full set of network troubleshooting tools, the Kali Linux Everything ISO, for example, will definitely have everything.
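
    A common way to put such an ISO on a stick is dd; the ISO filename and /dev/sdX below are placeholders, and dd will wipe whatever of= points at, so double-check the device with lsblk first. Here's a sketch of the same command run harmlessly against a scratch file:

```shell
# Real usage (placeholders!):
#   sudo dd if=kali-linux-everything.iso of=/dev/sdX bs=4M status=progress conv=fsync
# CAUTION: dd overwrites of= completely -- confirm the USB device with `lsblk`.
# Demo of the same flags on a scratch file, safe to run anywhere:
tmp=$(mktemp -d)
head -c 1M /dev/urandom > "$tmp/fake.iso"
dd if="$tmp/fake.iso" of="$tmp/stick.img" bs=4M status=none conv=fsync
cmp "$tmp/fake.iso" "$tmp/stick.img" && echo "byte-for-byte copy"
```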

  • aberrate_junior_beatnik@lemmy.world · 12 points · 1 year ago

    Yeah, that should work. ldd "$(command -v "$cmd")" will list the dynamic dependencies for $cmd, so you can find those (probably) in /lib and /usr/lib; I’m not familiar enough with the dynamic library loading process to give you the specifics. I would put the binaries in /usr/local/bin and the libraries in /usr/local/lib, but you could also modify path variables to point to the USB drive. Ideally you could find statically linked versions somewhere, so you don’t have to mess with the libraries at all.
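
    If you do go the copy-the-binary route, a sketch like this gathers a tool plus its ldd-reported libraries into one directory (the function name and the /mnt/usb path are made up for illustration):

```shell
#!/bin/sh
# Sketch: collect a binary and its dynamic libraries in one directory.
copy_with_deps() {
    bin=$(command -v "$1") || return 1
    mkdir -p "$2"
    cp "$bin" "$2/"
    # ldd lines look like "libc.so.6 => /lib/.../libc.so.6 (0x...)";
    # keep only the resolved absolute paths in the third column.
    ldd "$bin" | awk '$3 ~ /^\// {print $3}' | while read -r lib; do
        cp "$lib" "$2/"
    done
}

# e.g.: copy_with_deps ip /mnt/usb/tools   (assuming the USB is mounted there)
```

    On the broken machine you'd then run it with something like LD_LIBRARY_PATH=/mnt/usb/tools /mnt/usb/tools/ip link so the loader finds the copied libraries.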

    Alternatively, most package managers have commands to download packages; you can then copy the package cache over to the new machine and install them from it. If the commands are common enough, you could download one of the bigger install images and add its package repo to your machine. These are, of course, distribution-specific processes.
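
    On Debian/Ubuntu, for instance, the download step might look like this (`iw` is only an example package name, and the echo fallback just keeps the sketch harmless on non-apt systems):

```shell
# On the connected machine: fetch the .deb without installing it (no root needed).
command -v apt-get >/dev/null && apt-get download iw \
    || echo "apt-get not available here"
# Copy the downloaded .deb files to the USB drive; on the offline machine:
#   sudo dpkg -i /mnt/usb/debs/*.deb     # path is illustrative
```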

    Finally, you could get a cheap USB Ethernet adapter and connect to the internet that way. On Newegg, most of these products will have at least one review saying whether they work on Linux.

  • bizdelnick@lemmy.ml · 7 points · 1 year ago

    In general, no. A better way is to download the packages containing those tools from your distro’s repository, transfer them on a flash drive, and install them. You also have to download dependencies, but CLI tools usually have few of them, and there’s a good chance they are already installed.
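
    On Fedora-family distros, for example, the dnf download plugin can resolve dependencies for you (`iw` is an example package; --resolve needs dnf-plugins-core, and it only fetches deps missing on the machine doing the downloading, so use a box as close to the target system as possible):

```shell
# Fedora/RHEL-style sketch; package name and paths are illustrative.
command -v dnf >/dev/null \
    && dnf download --resolve --destdir /tmp/offline-rpms iw \
    || echo "dnf not available here"
# Copy /tmp/offline-rpms/*.rpm to the USB drive, then on the target machine:
#   sudo dnf install /mnt/usb/*.rpm
```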

  • min_fapper@iusearchlinux.fyi · 7 points · 1 year ago

    If you have an android phone, you can plug it in via USB and enable USB Internet tethering, which will give you working internet access on your machine to do the Wi-Fi debugging with.
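
    Once tethering is switched on, a new interface should appear on the PC (typically usb0 or enx<MAC>, though the name isn't guaranteed). A quick way to spot it:

```shell
# Interface names; run before and after enabling tethering and compare.
ls /sys/class/net
# When the new interface shows up, request an address on it, e.g.:
#   sudo dhclient usb0         # or: nmcli device connect usb0
```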

  • db2@sopuli.xyz · 4 points · 1 year ago

    When I had no (useful) internet where I was living a few years ago, I would save a list of packages to download from Synaptic onto a drive; then, when I was somewhere with a connection, I would download them, and when I got home I could plug in the drive and update/install them.

  • Depends on the tools. If they’re statically compiled, it should be fine. If they aren’t, it might still be fine if the distro and versions are similar. But what you want is statically compiled binaries.

    It’ll need to be the same architecture (ARM -> ARM good, AMD -> ARM bad), and check each tool on your working computer with ldd; the fewer lib dependencies, the better.

    Scripting languages are probably not worth messing with. Even if you have a working interpreter on the broken machine, scripting languages tend to lean heavily on third-party libs, which may not be installed. The exception is ba/sh scripts, which have a good chance of using only commonly installed commands (why else use bash?).
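
    An easy way to check on the working machine whether a given tool is statically linked (using ls here purely as a stand-in for whatever tool you want to move):

```shell
bin=$(command -v ls)     # substitute the tool you actually want to copy
file "$bin"              # reports "statically linked" or "dynamically linked"
ldd "$bin" || true       # prints "not a dynamic executable" for static binaries
```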

  • davefischer@beehaw.org · 1 point · 1 year ago

    If the package manager on your old PC is keeping copies of everything it installs, just copy all of those packages over and go through the package manager on the new PC. Look under /var/cache.
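
    On Debian-family systems, for example, that cache is /var/cache/apt/archives (other package managers use other paths, e.g. /var/cache/pacman/pkg for pacman):

```shell
# See what the old PC has cached (harmless if the cache is empty or absent):
ls /var/cache/apt/archives/*.deb 2>/dev/null || echo "no cached .deb files"
# Copy them to the USB drive, then on the new PC:
#   sudo dpkg -i /mnt/usb/archives/*.deb     # paths are illustrative
```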