Currently I'm doing it by SSHing into a server and running Vim there. This has the benefit of avoiding the cumbersome syntax for opening files from a remote server over SCP and, more importantly, of being able to navigate the server's filesystem really quickly.
I am trying to get a list of files to be SCPed from the remote server
by running the command below on my local Unix server (note: passwordless connectivity is set up between the local and remote servers), and we use ksh.
ssh "$scp_host" "find /a/b/c/*/ -iname '$remote_file'" > list.dat
The above writes the names of all files on the remote server that match the pattern into list.dat.
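A minimal sketch of the two-step pull, with hypothetical host and pattern values. The key fix is using single quotes around `$remote_file` inside the outer double quotes: in the original command, the inner double quotes ended the outer string early, so the pattern was expanded locally instead of by `find` on the remote server.

```shell
#!/bin/ksh
# Hypothetical values for illustration only.
scp_host="user@remote.example.com"
remote_file="report_*.csv"

# Inner single quotes keep the outer double-quoted string intact, so the
# whole find command (pattern included) reaches the remote shell unexpanded.
find_cmd="find /a/b/c/*/ -type f -iname '$remote_file'"

# Remote steps (commented out here; they need a reachable host):
#   ssh "$scp_host" "$find_cmd" > list.dat
#   while read -r f; do scp "$scp_host:$f" .; done < list.dat
echo "$find_cmd"
```

Note that `/a/b/c/*/` is also left for the remote shell to expand, which is what you want when the directories only exist on the remote side.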
This may be the most ridiculous question I've ever asked, but I freely admit that I am stumped. I'm trying to move some files from a remote server to my local machine. I'm running Ubuntu, obviously. Just for background: this machine has two different users, and I'm logged in as one of them, joe.
I am redesigning a network as follows: one Server 2012 Essentials domain controller and three backup servers spread over three geographical locations (one of which will be co-located with the DC).
For the purpose of redundancy, I want files put directly onto the DC, then pushed over to a FreeNAS box, and then everything on the FreeNAS box replicated over a WAN link to the other two locations.
I am simply not a programming guy; I deal with networks. I need a script to copy files from a remote server (Linux) to another machine, and I have only FTP access to the remote server. Files are generated daily but fetched twice a week. Can anyone help me out?
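Since only FTP is available, here is a minimal sketch using the stock ftp client driven by a command batch, which cron can run twice a week. The host, credentials, and paths below are placeholders:

```shell
#!/bin/sh
# All values here are hypothetical placeholders.
ftp_host="ftp.example.com"
ftp_user="netops"
ftp_pass="secret"
remote_dir="/outgoing"
local_dir="$HOME/ftp-pulls"

mkdir -p "$local_dir"

# Write the FTP command batch; "prompt" disables per-file confirmation
# so mget can fetch everything non-interactively.
cat > "$local_dir/fetch.ftp" <<EOF
open $ftp_host
user $ftp_user $ftp_pass
lcd $local_dir
cd $remote_dir
binary
prompt
mget *
bye
EOF

# Run it (commented out here; needs a reachable FTP server):
#   ftp -n < "$local_dir/fetch.ftp"

# Example cron entry to fetch on Mondays and Thursdays at 02:00:
#   0 2 * * 1,4 /path/to/fetch.sh
```

Storing the password in a plain-text file is a real drawback of FTP; restrict the script's permissions (`chmod 700`) if you go this route.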
On my Red Hat server, I know that some data is being written to the server from a remote server, but I do not know which files (or which directories) are being written to. Could you advise how I can find out which files are being updated on my server?
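A portable way to see what changed recently, assuming no extra tooling: `find -mmin` lists files modified in the last N minutes. The directory below is a temp stand-in for a real data path; on the actual server you would point it at the suspected location:

```shell
#!/bin/sh
# Demonstrated on a temp directory; on the real server you would run
# something like:  find /data -type f -mmin -5
d=$(mktemp -d)
echo "incoming" > "$d/fresh.txt"

# Files modified in the last minute:
find "$d" -type f -mmin -1

# To see which files processes currently hold open under a directory
# (requires lsof; run as root for a full view):
#   lsof +D /data
```

Running the `find` periodically (or watching with `lsof`) should narrow down which directory the remote server is writing into.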