12 - Networking Essentials

What this session is

About 45 minutes. You'll learn the network commands every Linux user eventually needs - fetch URLs, SSH to remote machines, copy files, see what's listening on ports.

Fetch a URL: curl

curl https://example.com                     # print to terminal
curl -o page.html https://example.com        # save to file
curl -I https://example.com                  # HEAD request (headers only)
curl -L https://bit.ly/something             # follow redirects
curl -X POST -d "name=alice" https://api.example.com    # POST request

curl is the universal HTTP client. Read its man page once; the flag inventory is huge but you'll use 5-10 of them regularly.

For JSON APIs:

curl -s https://api.github.com/users/octocat | jq

jq is a JSON processor. Install: sudo apt install jq / brew install jq. Pretty-prints and filters JSON. Pair with curl constantly.
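jq does more than pretty-print; a minimal taste of its filters, using inline JSON so it works offline (the field names here are just examples, not a real API response):

```shell
# Pretty-print the whole document, then extract one field as raw text (-r strips quotes)
echo '{"login":"octocat","id":583231}' | jq .
echo '{"login":"octocat","id":583231}' | jq -r '.login'    # → octocat
```

The same `-r '.field'` pattern works on real curl output, e.g. `curl -s ... | jq -r '.login'`.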

wget is a simpler alternative for "just download this":

wget https://example.com/file.zip

ssh: log into remote machines

ssh user@host                # log in
ssh user@host "command"      # run one command and exit
ssh -p 2222 user@host        # custom port (default 22)

The remote shell prompt is yours. Whatever you type runs on the remote machine.

First time connecting to a host: SSH asks you to verify the host's fingerprint. Say yes (after, ideally, verifying out-of-band). The fingerprint is stored in ~/.ssh/known_hosts.
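If a host's key later changes, or you want to see what's stored, ssh-keygen can query and edit known_hosts directly (the hostname below is an example):

```shell
ssh-keygen -F my-server.example.com    # show the stored entry for a host, if any
ssh-keygen -R my-server.example.com    # remove it (e.g. after a legitimate key change)
```

Handy when SSH warns "REMOTE HOST IDENTIFICATION HAS CHANGED" after a server reinstall.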

SSH keys: passwordless login

Type a password every time? Use a key pair instead.

Generate:

ssh-keygen -t ed25519 -C "your_email@example.com"

Saves ~/.ssh/id_ed25519 (private - keep secret, never share) and ~/.ssh/id_ed25519.pub (public - fine to share).

Copy your public key to the remote:

ssh-copy-id user@host

After: ssh user@host logs you in without a password.

Permissions matter:

  • ~/.ssh must be 700.
  • ~/.ssh/id_* private keys must be 600.
  • ~/.ssh/id_*.pub public keys can be 644.

Wrong permissions and SSH refuses to use the keys.
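To fix them (paths assume the ed25519 key generated above):

```shell
chmod 700 ~/.ssh                     # only you can enter the directory
chmod 600 ~/.ssh/id_ed25519          # private key: owner read/write only
chmod 644 ~/.ssh/id_ed25519.pub      # public key: world-readable is fine
```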

Copy files: scp and rsync

scp (secure copy):

scp file.txt user@host:/path/to/dest/        # local to remote
scp user@host:/remote/file.txt local-name    # remote to local
scp -r mydir user@host:/dest/                # recursive (directories)

rsync is much smarter - incremental, resumable, efficient over slow links:

rsync -avh source/ user@host:/dest/          # sync directory contents
rsync -avh --delete src/ dest/               # also delete dest files not in src
rsync -avh --dry-run src/ dest/              # show what WOULD change

-a = archive (preserves permissions, recursion, etc.), -v = verbose, -h = human-readable sizes.

The trailing / on the source matters:

  • rsync src/ dest/ - copy the contents of src into dest.
  • rsync src dest/ - copy src itself into dest (as dest/src).
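You can see the trailing-slash difference safely with throwaway local directories:

```shell
mkdir -p /tmp/demo/src && touch /tmp/demo/src/a.txt
rsync -a /tmp/demo/src/ /tmp/demo/d1/    # creates /tmp/demo/d1/a.txt
rsync -a /tmp/demo/src  /tmp/demo/d2/    # creates /tmp/demo/d2/src/a.txt
```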

Use rsync for everything except trivial single-file copies.

What's listening on what port: ss

ss -tlnp                # TCP, Listening, Numeric, Process info
ss -tunlp               # also UDP

Shows which programs are listening on which ports.

Older command: netstat -tlnp. Same idea, deprecated in favor of ss.

sudo ss -tlnp           # needs sudo to show process info for other users' processes

What process owns a port: lsof -i

sudo lsof -i :8080      # what's on port 8080
sudo lsof -i tcp        # all TCP usage

Useful when "port already in use" - lsof tells you who's holding it.
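The common follow-up is killing the offender. lsof's -t flag prints only PIDs, so it composes with kill (8080 is an example port; sudo is only needed when the process belongs to another user):

```shell
lsof -i :8080                # inspect first
kill "$(lsof -t -i :8080)"   # then terminate whatever listens on 8080
```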

DNS lookup: dig and nslookup

dig example.com
dig +short example.com           # just the IP(s)
dig example.com MX               # mail exchanger records
nslookup example.com             # older alternative

dig is the modern, scriptable tool. nslookup is older and still around.

Ping and traceroute

ping example.com                 # send ICMP echo; press Ctrl-C to stop
traceroute example.com           # show the route packets take

Useful for "is this host reachable?" and "where does the path break?"

On some systems traceroute isn't installed by default, and firewalls may block its probes; try mtr (an interactive combination of ping and traceroute) if installed.

Firewall: ufw (Ubuntu)

sudo ufw status                  # what rules exist
sudo ufw allow 22/tcp            # allow SSH
sudo ufw allow http              # allow HTTP (port 80)
sudo ufw enable                  # turn on the firewall
sudo ufw deny 23                 # block telnet

Beyond beginner; mentioned for awareness. Most desktop users don't manage their firewall manually.

A real session: SSH into a server, sync a directory

# One-time setup: generate key, copy to remote
ssh-keygen -t ed25519
ssh-copy-id alice@my-server.example.com

# Now SSH passwordless
ssh alice@my-server.example.com
# ... do stuff on remote ...
exit

# Sync a local dir to the server
rsync -avh --delete ~/projects/myapp/ alice@my-server.example.com:/srv/myapp/

# Or fetch a file from the server
scp alice@my-server.example.com:/var/log/app.log ./

A few useful patterns

Test a webhook endpoint:

curl -X POST https://example.com/webhook \
  -H "Content-Type: application/json" \
  -d '{"event":"test"}'

Download a tar archive and extract:

curl -L https://example.com/foo.tar.gz | tar -xz

The tar -xz extracts a gzipped tar from stdin.

Stream output from a remote command:

ssh user@host "tail -f /var/log/app.log"

tail -f on the remote, output streams to your local terminal.

Exercise

  1. Fetch a URL:

    curl -s https://api.github.com/users/octocat
    
    Then pipe to jq if installed:
    curl -s https://api.github.com/users/octocat | jq
    

  2. DNS lookup:

    dig +short github.com
    

  3. See what's listening on your machine:

    ss -tlnp 2>/dev/null
    
    What ports does your computer expose?

  4. Generate an SSH key (if you don't have one):

    ls ~/.ssh/                              # check first
    ssh-keygen -t ed25519                   # if no id_ed25519 exists
    cat ~/.ssh/id_ed25519.pub               # your public key
    
    Copy the public key - you'll need it for GitHub (page 15) and any servers.

  5. Add your key to GitHub: GitHub Settings → SSH and GPG keys → New SSH key → paste your public key. After: ssh -T git@github.com should respond with your username.

  6. Bonus - rsync one folder to another with --dry-run to see what would change:

    rsync -avh --dry-run --delete src/ dest/
    
    Useful before destructive syncs.

What you might wonder

"What's tmux for in this context?" SSH sessions die when your local connection drops. Run things inside tmux on the remote and they survive - reconnect with tmux attach. Indispensable for any remote work.

"What about nc (netcat)?" Low-level "make/accept TCP connections, send/receive bytes." Useful for testing services, transferring files when other tools aren't available. Niche but powerful.

"How do I serve a local directory over HTTP for quick sharing?"

python3 -m http.server 8000

Serves the current directory on port 8000. Open http://localhost:8000 in a browser. Great for sharing files on a LAN or testing.

"VPN, proxies, tunnels?" SSH itself can do port forwarding (ssh -L 8080:dest:80 user@host creates a tunnel). Beyond beginner; useful to know exists.

Done

  • Fetch URLs with curl (and maybe wget).
  • SSH to remote machines, with keys for passwordless.
  • Copy files with scp and rsync.
  • See listening ports with ss.
  • Look up DNS with dig.

You've now covered the core CLI skills. Remaining pages: how to apply this to OSS contribution.

Next: Picking a project →
