FogFTP login failed
-
When you see the boot menu, there is no task for this host…
-
All my problems are solved. Thank you for everything, guys, and thank you Uncle Frank.
-
It’s quite an old topic but I’m actually facing the same issue, and I’ve tried playing with the space in the password but it does nothing.
Error type:
fogFTP: login failed. host:xx.xx.xx.15, Username: fog, passwrd: mypwd, Error: ftp_login(): Please specify the password
The password has been changed in Storage Management, and it is the same on the TFTP side as well (in FOG Settings).
I’ve also aligned this password with passwd fog. I’m on FOG 1.2 on CentOS 7.
It happens right at the end, on the client side.
For your info, I can also access FTP from a remote client with the fog credentials.
The /images folder is set to 777 recursively.
SELinux is fully disabled. -
@kortnor Can you try to manually FTP into the fog server from a remote computer using the credentials you have inside storage management? What happens?
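(For anyone without an interactive ftp client handy, a quick one-liner sketch with curl does the same check; the IP, username and password below are just the placeholders used in this thread:)
curl -v --user fog:'mypwd' ftp://10.10.0.15/images/
(A 230 login response followed by a directory listing means the credentials are accepted; a 530 response means they are not.)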
-
@Wayne-Workman
Hello Wayne, I’ve been able to make a remote connection with FTP. Here is an extract of the session:
[root@pla pla]# ftp
ftp> open 10.10.0.15
Connected to 10.10.0.15 (10.10.0.15).
220 (vsFTPd 3.0.2)
Name (10.10.0.15:root): fog
331 Please specify the password.
Password:
230 Login successful.
Remote system type is UNIX.
Using binary mode to transfer files.
ftp> cd /images
250 Directory successfully changed.
ftp> ls
227 Entering Passive Mode (10,10,0,15,119,3).
150 Here comes the directory listing.
drwxrwxrwx 3 1001 0 41 Mar 13 16:45 dev
-rw-r--r-- 1 1001 1001 2629 Mar 13 20:24 file.pcap
drwxrwxrwx 2 1001 0 29 Mar 13 12:59 postdownloadscripts
226 Directory send OK.
ftp> cd dev
250 Directory successfully changed.
ftp> ls
227 Entering Passive Mode (10,10,0,15,139,245).
150 Here comes the directory listing.
drwxrwxrwx 2 1001 0 87 Mar 13 17:06 288023fd45ae
226 Directory send OK.
ftp> cd 288023fd45ae
250 Directory successfully changed.
ftp> ls
227 Entering Passive Mode (10,10,0,15,207,231).
150 Here comes the directory listing.
-rwxrwxrwx 1 1001 0 0 Mar 13 18:42 d1.has_grub
-rwxrwxrwx 1 1001 0 1048576 Mar 13 18:42 d1.mbr
-rwxrwxrwx 1 1001 0 313 Mar 13 18:42 d1.partitions
-rwxrwxrwx 1 1001 0 2027153647 Mar 13 18:54 d1p1.img
-rwxrwxrwx 1 1001 0 136 Mar 13 18:54 d1p2.img
226 Directory send OK.
I can see that the image has been created (at least I think so, based on the listing above).
When I try to download the stored image within dev/288023fd45ae, the VM returns the related error.
-
@kortnor The image currently uploaded is put in /images/dev/<mac-address> and will be moved to /images/<img-name> at the end of the upload process (probably where you see the FTP login error). You can move the directory by hand if you like (
mv /images/dev/<mac-address> /images/<img-name>
) but you’d still want to find out what’s wrong with the FTP settings! Please check that the username and password of the FTP user (you were able to successfully log in via FTP, as we see) match the username fog and password in storage management and the TFTP password in the web UI under FOG Settings! As you can see in the old post, whitespace in those fields has caused trouble before. Please double check all the settings!
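To see exactly what PHP’s ftp_login() reports for a given user/password combination (a minimal sketch, assuming the php-cli and php-ftp packages are installed on the server; host, user and password below are the placeholders from the error message above):
php -r '$c = ftp_connect("xx.xx.xx.15"); var_dump($c && ftp_login($c, "fog", "mypwd"));'
It prints bool(true) when vsftpd accepts the combination and bool(false) (plus an ftp_login() warning, if display_errors is on) when it rejects it; bool(true) would point the finger at what FOG has stored for the node rather than at the FTP server itself.
-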
@Sebastian-Roth It’s likely caused by the autofill feature of certain browsers. Would it be possible to add that to the wiki, so people are aware that every time they make changes to the storage nodes they have to enter their actual password?
-
SELinux is another thing to check. And as Sebastian and Quazz mentioned, many people get bitten by their browser’s Auto-Fill functionality.
-
@Sebastian-Roth
I’ve copied /images/dev/<mac> to /images/<image name in Image Management>.
It appears to do something really quickly on the client side, then it deletes the task in Task Management, but it hasn’t uploaded the image (in /images/).
Also, I’ve checked the users within the FOG UI and they are correct. I also added a space in front of the password and it reflected directly on the client side (in the error shown on the screen), so I don’t think the problem is in the UI.
-
@kortnor Any tricky special characters in your password? Tom has done a lot to handle all sorts of characters correctly but you never know…?!?
You are saying that you use FOG 1.2.0 - is this the version number you see in the blue cloud on the web interface? If you really use 1.2.0 I am totally lost with this. Hundreds of people are using FOG 1.2.0 and the image moving via FTP should definitely work - if password/user is correct. I am wondering what else we can do to help you find out what the issue is. Maybe you set a really easy password just for testing and give it a try.
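If you go the simple-test-password route, the three places that have to agree are (using Test1234 purely as an example password):
sudo passwd fog          # local fog user on the FOG server, type the test password twice
Then enter the very same test password for your storage node under Storage Management and as the FTP/TFTP password under FOG Settings in the web UI, and run another upload.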
-
@Sebastian-Roth
First, thank you Sebastian for your time.
I’ve completely aligned the password everywhere related to the fog user, as far as I can tell.
However, I still have the same issue at the end. Can somebody tell me if I’ve missed a parameter? Also, from a remote client I can make an FTP connection to the FOG server and put/get and so on (explained in an earlier reply in this topic).
I’m kind of lost. Here is where I have set the password (see also the quick check after this list):
OS layer:
[root@localhost ~]# echo "fog:password" | chpasswd
user management
tftp server
storage management
error from client VM
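A quick check that the OS layer is actually good (a sketch, assuming curl is available on the FOG server; replace password with the one you just set):
id fog
curl -sS --user fog:'password' ftp://127.0.0.1/images/ -o /dev/null && echo "local FTP login OK"
If that prints local FTP login OK, the local fog account and vsftpd are fine and it is worth looking again at the web UI fields.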
-
@kortnor The only other place you would need to change the password is on the actual fog server.
You have to do this to the local user. It’d be better to do:
echo -e "password\npassword\n" | sudo passwd fog
(or very similar). I’d usually just do:
sudo passwd fog
and manually type the password. I don’t know that chpasswd will fix that because isn’t chpasswd for domains?
-
@Tom-Elliott I remoted in and was able to confirm all his FTP settings, and verify it worked.
Then I found out he has PHP 7 installed. I recommended updating to trunk, and explained the pros/cons of it. He’s going to give it a go.
-
This is really strange! Can we get a tcpdump/wireshark packet dump of the FTP connection when the errors show up after upload?? Just to make sure FOG is sending the correct user/password to the FTP daemon.
@kortnor The easiest way to get a good packet dump of this is probably using tcpdump on the FOG server. Install the tcpdump package (yum, dnf, apt-get, aptitude, …) and let your client do the upload. Prepare everything and, as soon as you get the first ftp_login message on screen, fire up
tcpdump -i eth0 -w ftp_issue.pcap host 10.10.0.x
(where 10.10.0.x must be your client’s IP address!). After you have seen the error messages five times you can stop tcpdump (Ctrl+C). Please upload the PCAP file here in the forum!
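Once you have the file, the FTP control channel (including the exact USER and PASS lines FOG sends) can be read straight out of the capture, for example with (a sketch; adjust the filename if you used a different one):
tcpdump -nn -A -r ftp_issue.pcap 'tcp port 21'
FTP sends credentials in plain text, so you will literally see the password FOG tried to use, which is handy for spotting leading/trailing spaces or an empty password.
-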
@Sebastian-Roth I’m in the habit now of looking at what is stored in the DB for the nfsGroupMembers table. His credentials stored there worked for connecting to the fog server via FTP.
I also checked firewall and SELinux. Both were disabled.
Only thing I could see that was different and unusual was PHP 7.
That said - I don’t disagree with a packet capture.
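For reference, the quickest way to eyeball what is stored in that table (a sketch, assuming the default database name fog and that you can run mysql as root on the FOG server):
mysql -u root -p fog -e 'SELECT * FROM nfsGroupMembers \G'
The FTP user and password shown there should be what FOG hands to ftp_login() at the end of an upload, so it is worth comparing them character by character with the OS-level credentials.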