Security@Georgeliu.me

A mounted Box.com drive and duply are a good combination for backing up data to a remote location. Duply makes encrypted backups, and because the archives are GPG-encrypted, corruption (bit rot) becomes detectable. A remote drive also helps keep cloud-based servers free of extraneous data, which is especially useful when paying per GB per month for storage.

My problem started after I upgraded Ubuntu. Apparently, the upgrade left me with the default configuration for davfs2 instead of my custom one, which caused a few errors. But I also learned a few things:

To use Box.com, you must have an appropriately sized cache and disable file locking in the davfs2 config.

To set this up, edit the config: sudo nano /etc/davfs2/davfs2.conf

Add these two lines anywhere (preferably under the commented-out section or at the bottom):

    use_locks 0
    cache_size 100
  • File locking must be disabled because Box's WebDAV service does not support it. Without locks, a file could in principle be changed on the server while it is being uploaded to Box (and vice versa), but a timed sync/backup strategy should prevent this.
  • Box.com limits files to around 50 MB, and duply's volumes default to 25 MB, so a 100 MB cache is safely above both limits.
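For context, the Box drive itself is typically mounted through an /etc/fstab entry along these lines. The https://dav.box.com/dav endpoint is Box's WebDAV address; the /mnt/box mount point is just an example, and credentials go in /etc/davfs2/secrets:

    # /etc/fstab (example mount point; noauto mounts on demand)
    https://dav.box.com/dav /mnt/box davfs rw,user,noauto 0 0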
If a mounted drive cannot be written to, the davfs2 cache can grow very large.

In my case, it was up to around 9 GB. Check what's going on with journalctl: sudo journalctl -b | grep mount, or something like journalctl -u davfs2 --since today. You might see a message like open files exceed max cache size. Then check the davfs2 configuration to increase the cache size and see whether locking is enabled, as detailed above.
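Two quick checks, sketched below. The cache directory here is a throwaway demo path so the snippet runs anywhere; the real system-wide cache usually lives in /var/cache/davfs2 (or ~/.davfs2/cache for user mounts).

```shell
# 1. Measure the cache. CACHE_DIR is a demo stand-in for
#    /var/cache/davfs2, created here so the check runs anywhere.
CACHE_DIR="/tmp/davfs2-cache-demo"
mkdir -p "$CACHE_DIR"
du -sh "$CACHE_DIR"

# 2. Scan the journal for mount/davfs errors (needs systemd, so it is
#    left commented out here):
# sudo journalctl -b | grep -i mount
# journalctl --since today | grep -i davfs
```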

The davfs2 cache can be deleted safely if a drive is unmounted.
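A minimal sketch of that cleanup, with a guard so it only runs while the drive is unmounted. MOUNT_POINT and CACHE_DIR are hypothetical; substitute your real mount point and /var/cache/davfs2 (or ~/.davfs2/cache).

```shell
MOUNT_POINT="/mnt/box"              # hypothetical mount point
CACHE_DIR="/tmp/davfs2-cache-demo"  # demo stand-in for the real cache

mkdir -p "$CACHE_DIR"
touch "$CACHE_DIR/stale-chunk"      # simulate leftover cache data

# Only delete if MOUNT_POINT does not appear in the mount table.
if ! grep -qs " $MOUNT_POINT " /proc/mounts; then
    rm -rf "${CACHE_DIR:?}"/*       # :? guards against an empty variable
fi
```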

Unmount using umount /LOCATION/NAME

If a drive is busy, it can't be unmounted. You can force an unmount (at the risk of data corruption) with umount -f or umount -l, but it's better to identify and kill the processes using the drive:

  1. lsof | grep '/dev/sda1' (change /dev/sda1 to the mounted drive name)
  2. pkill target_process (kills the busy process by name; kill PID and killall target_process also work)
  3. umount /dev/sda1
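The steps above can be sketched with a throwaway file standing in for a busy drive: tail -f keeps the file open, lsof shows the holder, and kill frees it. On a real system you would target /dev/sda1 or the mount point instead.

```shell
BUSY_FILE="$(mktemp)"
tail -f "$BUSY_FILE" &          # a process holding the file open
TAIL_PID=$!

# Step 1: list the holders (skipped gracefully if lsof is missing).
{ command -v lsof >/dev/null && lsof "$BUSY_FILE"; } || true

# Step 2: stop the process -- by PID here; pkill/killall work by name.
kill "$TAIL_PID"
wait "$TAIL_PID" 2>/dev/null || true

# Step 3: with no holders left, the file (or drive) can be released.
rm -f "$BUSY_FILE"
```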

How do you set up a duply and Box.com backup? I think I wrote about it before, but if not, here are some links: 1, 2
Other resources: 1, 2, 3, 4, 5


Upgrading between Ubuntu LTS releases can be a nightmare. Even with proper backups, a lot of software and configurations change, and hunting down settings that have broken is extremely time consuming. I suppose that’s why experts say it’s easier to start from scratch. I’ll try that next time.

What happened to me:

  • Not all of the necessary PHP 7 modules were installed
  • Caching configs did not carry over, so the Nginx server blocks had to be rewritten. Caching is still not working.

The bulk of what I had to do was rewrite server blocks to remove some FastCGI caching and update the Nginx configs to use PHP 7 instead of PHP 5:

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
    
        # With php7.0-cgi alone:
        #fastcgi_pass 127.0.0.1:9000;
        # With php7.0-fpm:
        fastcgi_pass unix:/run/php/php7.0-fpm.sock;
    }

(source)

Steps

  1. Follow the DigitalOcean upgrade guide
  2. Upgrade kernel
  3. Confirm settings regarding PHP
  4. Install mods
    sudo apt-get install php-cli php-fpm php-mysql php-curl php-gd php-mbstring php-mcrypt php-xml php-xmlrpc
    sudo systemctl restart php7.0-fpm

If php-* doesn’t work, try php7.0-*
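After restarting, it's worth confirming that the socket named in the server block actually exists; a sketch (adjust the path for your PHP version):

```shell
# Check for the PHP-FPM socket referenced by the nginx config above.
SOCK="/run/php/php7.0-fpm.sock"
if [ -S "$SOCK" ]; then
    STATUS="present"
else
    STATUS="missing"
fi
echo "php-fpm socket $STATUS: $SOCK"
```

If it reports missing, php7.0-fpm is likely not running, or the pool config points elsewhere.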


Photo by DudeOmega