Scenario: I want to control/access my comp1 (which runs Linux) from my comp2 (which runs macOS).
Reason: I can't do Data Science work on comp2 (Mac M1), whereas comp1 has everything (Linux, GPU, ...).
Both computers must be connected to the same network!
If you have a Windows PC for gaming and you want to play its games from another machine, try Steam Link. Read this note for more: Stream games between 2 computers in the same local network.
- Windows: Windows Remote Desktop or NoMachine. WRD has better display quality and usability; it feels as if you were working directly on the remote machine.
- Linux / macOS: NoMachine.
The NoMachine service is only needed on the "server" side. If a machine is used as a client only, you don't have to enable the NoMachine service on it; just open the NoMachine app on that client.
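If you ever need to check or restart the NoMachine server on comp1, a default Linux install registers an `nxserver` systemd unit and ships its own CLI under `/usr/NX/bin` (the unit name and path are assumptions based on a standard NoMachine install):

```bash
# Check the NoMachine server's status (systemd unit name assumed: nxserver)
sudo systemctl status nxserver
# Or use NoMachine's own CLI (default install path assumed)
sudo /usr/NX/bin/nxserver --status
sudo /usr/NX/bin/nxserver --restart
```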
Don't use `nx://100.xx.xxx.x:4000`, use `nx://192.168.xx.xx:4000` instead!

❇️ On the "server computer" (comp1 -- Linux)
```bash
# Know comp1's name
hostname
# or `hostnamectl` or `cat /proc/sys/kernel/hostname`
# mine: pop-os

# Know the current user
whoami
# mine: thi
# You must know this user's password!

# Install openssh-server
sudo apt update
sudo apt install openssh-server

# Check comp1's IP
ifconfig | grep "192.168"
# mine: 192.168.1.115
```
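If the connection is refused later on, make sure the SSH daemon is actually enabled and running; on Ubuntu-based systems such as Pop!_OS the systemd unit is called `ssh`:

```bash
# Enable the SSH daemon now and on every boot
sudo systemctl enable --now ssh
# Check that it is active
sudo systemctl status ssh
```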
Test: connect from comp1 to comp1 itself!
```bash
ssh 127.0.0.1
# type the current user's (thi's) password
```
❇️ On the "client computer" (comp2 -- MacOS)
```bash
# Connect via comp1's name
ssh thi@pop-os.local
# Type thi's password

# Connect via comp1's IP
ssh thi@192.168.1.115
```
❇️ Disconnect
```bash
exit
```
❇️ Copy files
```bash
# From the client to the remote machine
scp /on/client/file.zip thi@pop-os.local:/on/remote/
# (or rename the file at the destination)
scp /on/client/file.zip thi@pop-os.local:/on/remote/file_renamed.zip

# From the remote machine to the client
scp thi@pop-os.local:/on/remote/file.zip /on/client/
```
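If you need to copy whole folders (or resume interrupted transfers), `rsync` over SSH is handy too, assuming it is installed on both machines; the paths below are just placeholders:

```bash
# Sync a folder from the client to comp1 (archive mode, compressed, show progress)
rsync -avz --progress /on/client/project/ thi@pop-os.local:/on/remote/project/
# And in the other direction
rsync -avz --progress thi@pop-os.local:/on/remote/project/ /on/client/project/
```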
Tip: You can use an SFTP client (e.g. Cyberduck) to do this visually.
```bash
# server
pop-os.local # or use the IP address
# port
22
# username
thi
# password
```
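If you prefer staying in the terminal, the same credentials also work with the built-in `sftp` client:

```bash
# Interactive SFTP session to comp1
sftp thi@pop-os.local
# then use: ls, cd, get file.zip, put file.zip, exit
```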
Suppose there is a JupyterLab server running on comp1 (in my case, it runs inside a Docker container whose port is mapped to comp1's port `8888`).

```bash
# On comp2
ssh -N -L localhost:8888:127.0.0.1:8888 thi@pop-os.local
# Remark: keep this terminal open
```
Then open http://localhost:8888/lab to see the result!
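To avoid retyping that long tunnel command, you can also put it in `~/.ssh/config` on comp2; this is a minimal sketch, and the alias `comp1-jupyter` is just a name I made up:

```
Host comp1-jupyter
    HostName pop-os.local
    User thi
    LocalForward 8888 127.0.0.1:8888
```

Then `ssh -N comp1-jupyter` opens the same tunnel.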
Now I want to SSH from comp2 into a container that is running on comp1.
❇️ Suppose the running container on comp1 was created from an image that doesn't have openssh-server set up by default. We will set up an SSH server inside the running container.
```bash
# Check the name of the running container
docker ps # mine: docker_ai

# Go inside the running container
docker exec -it docker_ai bash

# [inside the container]

# Install the SSH server (and an editor)
apt update && apt install openssh-server nano
# Change `root`'s password
passwd # suppose: qwerty

nano /etc/ssh/sshd_config
# and add
#   Port 2222
#   PermitRootLogin yes

# Start the SSH server
/etc/init.d/ssh start
```
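A quick sanity check (run on comp1) that `sshd` is really up inside the container:

```bash
# The sshd process should appear in the container's process list
docker top docker_ai | grep sshd
# Or ask the init script inside the container
docker exec docker_ai /etc/init.d/ssh status
```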
In the `docker-compose.yml`:

```yaml
# expose the ports
ports:
  - "6789:2222"
```
```bash
# Test on comp1
ssh -p 6789 root@localhost
# enter the "qwerty" password for "root"

# Connect from comp2
ssh -p 6789 root@pop-os.local # or use comp1's IP
# enter the "qwerty" password for "root"
```
❇️ In case your image already has `openssh-server` installed but doesn't run it by default, we will run the SSH server on port `22` in the running container.

Add the line below to the `Dockerfile` if you want to run openssh-server by default:

```dockerfile
CMD $(which sshd) -Ddp 22
```
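For reference, here is a minimal `Dockerfile` sketch that bakes the whole SSH setup into the image; the base image and the `qwerty` password are assumptions for illustration only:

```dockerfile
FROM ubuntu:22.04

# Install the SSH server and create the runtime directory sshd expects
RUN apt-get update && apt-get install -y openssh-server && mkdir -p /run/sshd

# Set root's password and allow root login over SSH (for a trusted LAN only!)
RUN echo 'root:qwerty' | chpasswd \
    && echo 'PermitRootLogin yes' >> /etc/ssh/sshd_config

EXPOSE 22
# Run sshd in the foreground on port 22
CMD ["/usr/sbin/sshd", "-D", "-p", "22"]
```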
Remark: We shouldn't (can't?) run two servers in parallel from the Docker image (for example, one for Jupyter Notebook on port `8888` and one for openssh-server on port `22`).

💡 In this case, keep the Jupyter Notebook running. Each time you want to run openssh-server, you can run:

```bash
docker exec docker_ai $(which sshd) -Ddp 22 # and keep this tab open
# or
docker exec -d .... # detached mode
```
You can also do this entirely from comp2:

```bash
ssh thi@pop-os.local
# Then you are in comp1's terminal
docker exec ....
```
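You can even collapse the two steps into a single command from comp2, assuming `sshd` lives at `/usr/sbin/sshd` inside the container (the default path for the Debian/Ubuntu `openssh-server` package):

```bash
# Start sshd (detached) inside the container, in one shot from comp2
ssh thi@pop-os.local 'docker exec -d docker_ai /usr/sbin/sshd -Ddp 22'
```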
Important remark: If you enter the container's shell and then leave it with the `exit` or `logout` command, that also terminates the SSH server and you have to start it again!

Don't forget to forward port `22` (in the container) to `6789` on comp1 via `docker-compose.yml`.
```bash
# On comp1
docker exec <container_name> $(which sshd) -Ddp 22
# Keep this tab open and running
```

```bash
# On comp2
ssh -p 6789 root@pop-os.local # or use comp1's IP
# enter pwd: "qwerty" as in the Dockerfile
```
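Optionally, an entry in comp2's `~/.ssh/config` saves retyping the port and user every time (the alias `comp1-docker` is just a name I chose):

```
Host comp1-docker
    HostName pop-os.local
    Port 6789
    User root
```

Then a plain `ssh comp1-docker` drops you straight into the container.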