I have a server with a 17 TB SCSI storage device. In the past the storage held a JFS filesystem; now I want to create an ext4 filesystem on it. I have updated e2fsprogs from 1.41 to 1.42 (e2fsprogs up to 1.41 is limited to 16 TB).
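For reference, I created the filesystem roughly like this (a sketch from memory, not my exact invocation; the important part is the 64bit feature that e2fsprogs 1.42 adds for filesystems larger than 16 TB):

    mkfs.ext4 -O 64bit /dev/sda1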
Now I have the 17 TB storage as /dev/sda1 with ext4. I can mount this device as /home (the /etc/fstab entry is "/dev/sda1 /home ext4 defaults 1 2"). Then I run e2fsck on /dev/sda1 (unmounted), and no error messages appear on the screen.
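For completeness, the fstab entry and the check look like this (the fsck is run with the filesystem unmounted; I did not pass any extra e2fsck options):

    # /etc/fstab
    /dev/sda1   /home   ext4   defaults   1 2

    umount /home
    e2fsck /dev/sda1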
If I now reboot the server, it does not start: the boot screen shows an error message complaining that the device is still in use.
I thought this might be a problem with e2fsprogs 1.42, so I reinstalled the server with the default e2fsprogs 1.41 from CentOS 6.4 and created only a 16 TB /dev/sda1 partition with ext4. But if I run "e2fsck /dev/sda1" and reboot the server, I get the same message on the boot screen and the server does not boot.
Why does the system think that the device is still in use? How can I change this?