Running your own mail and groupware server can be challenging. I recently had to re-create my own setup from the ground up and I am describing the steps in a blog post series. This post is #4 of the series and covers the integration of Nextcloud for storage.
Let’s add Nextcloud to the existing mail server. This part will focus on setting it up and configuring it in basic terms. Groupware and webmail will come in a later post! If you are new to this series, don’t forget to read part 1: what, why, how?, and all about the mail server setup itself in the second post, part 2: initial mail server setup. We also added a Git server in part 3: Git server.
Nextcloud as “cloud” server
Today’s online experience does not only cover mail and other groupware functions, but also the interaction with files in some kind of online storage. Arguably, for many people this is by now more important than e-mail.
Thus it makes sense to add a service to the mail server that provides a “cloud” experience around file management. The result lacks the high availability of a real cloud offering, but it provides a rich UI, accessibility from all kinds of devices and integration with various services. It also offers the option to extend the functionality further.
Nextcloud is probably the best-known solution for self-hosted cloud setups and is also used at large scale by universities, governments and companies. I picked it because I had past experience with it and because it offers some integrations and add-ons I really like and depend on.
Alternatives worth checking out are ownCloud, Seafile and Pydio.
Integration into mailu setup
Nextcloud can be added to an existing mailu setup in three steps:
- Let Nginx know about the service
- Add a DB and set it up
- Add Nextcloud
The proxy bit is easily done by creating the file /data/mailu/overrides/nginx/nc.conf with the following content:
location /nc/ {
    add_header Front-End-Https on;
    proxy_buffering off;
    fastcgi_request_buffering off;
    proxy_pass http://nc/;
    client_max_body_size 0;
}
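To verify that the front container picks up the override without syntax errors, a quick check like the following can help (assuming the front container is named mailu_front_1, as it appears later in this setup):

# test the nginx configuration inside the mailu front container, then reload it
sudo docker exec mailu_front_1 nginx -t
sudo docker exec mailu_front_1 nginx -s reload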
We also need a DB. Add this to docker-compose.yml:
  # Nextcloud
  ncpostgresql:
    image: postgres:12
    restart: always
    environment:
      POSTGRES_PASSWORD: ...
    volumes:
      - /data/ncpostgresql:/var/lib/postgresql/data
Make sure to add a proper password here! Next, we have to bring the environment down and up again to add the DB container, and then access the DB and create the right users and database with corresponding privileges:
- Get the DB up & running: docker compose down and docker compose up
- access DB container:
sudo docker exec -it mailu_ncpostgresql_1 /bin/bash
- become the postgres super user and start psql (the SQL statements below are run at the psql prompt):
su - postgres
psql
- add the user nextcloud (set a proper password here):
CREATE USER nextcloud WITH PASSWORD '...';
- add nextcloud database:
CREATE DATABASE nextcloud TEMPLATE template0 ENCODING 'UNICODE';
- change database owner to user nextcloud:
ALTER DATABASE nextcloud OWNER TO nextcloud;
- grant all privileges to nextcloud:
GRANT ALL PRIVILEGES ON DATABASE nextcloud TO nextcloud;
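If you prefer not to enter the container interactively, the same setup can be scripted from the host. This is just a sketch of the equivalent one-liners, again using the container name from above and leaving the password for you to fill in:

# run the SQL non-interactively against the DB container
sudo docker exec mailu_ncpostgresql_1 psql -U postgres -c "CREATE USER nextcloud WITH PASSWORD '...';"
sudo docker exec mailu_ncpostgresql_1 psql -U postgres -c "CREATE DATABASE nextcloud TEMPLATE template0 ENCODING 'UNICODE';"
sudo docker exec mailu_ncpostgresql_1 psql -U postgres -c "ALTER DATABASE nextcloud OWNER TO nextcloud;"
sudo docker exec mailu_ncpostgresql_1 psql -U postgres -c "GRANT ALL PRIVILEGES ON DATABASE nextcloud TO nextcloud;"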
Now we can add the Nextcloud container itself. We will add a few environment variables to properly configure the DB access and the initial admin account. Add the following listing to the Docker Compose file:
  nc:
    image: nextcloud:apache
    restart: always
    environment:
      POSTGRES_HOST: ncpostgresql
      POSTGRES_USER: nextcloud
      POSTGRES_PASSWORD: ....
      POSTGRES_DB: nextcloud
      NEXTCLOUD_ADMIN_USER: admin
      NEXTCLOUD_ADMIN_PASSWORD: ...
      NEXTCLOUD_TRUSTED_DOMAINS: front
      REDIS_HOST: redis
    depends_on:
      - resolver
      - ncpostgresql
    dns:
      - 192.168.203.254
    volumes:
      - /data/nc/main:/var/www/html
      - /data/nc/custom_apps:/var/www/html/custom_apps
      - /data/nc/data:/var/www/html/data
      - /data/nc/config:/var/www/html/config
      - /data/nc/zzz_upload_php.ini:/usr/local/etc/php/conf.d/zzz_upload_php.ini
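One practical note on the bind mounts above: I would create the host-side paths before the first start, because Docker otherwise creates missing bind-mount sources as root-owned directories, and the last entry must exist as a file, not a directory. A minimal preparation could look like this:

# create the host directories and the (for now empty) PHP override file
sudo mkdir -p /data/nc/main /data/nc/custom_apps /data/nc/data /data/nc/config
sudo touch /data/nc/zzz_upload_php.ini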
Nextcloud configuration
Before we launch Nextcloud, we need to configure it properly. As shown in the last line of the previous listing, a dedicated file defines the PHP file upload sizes. This is only needed in corner cases (browsers split up files during upload automatically these days), but it can help sometimes. Create the file /data/nc/zzz_upload_php.ini:
upload_max_filesize=2G
post_max_size=2G
memory_limit=4G
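Once the nc container is up (after the restart described next), a quick way to confirm that PHP picked up these limits might be:

# show the effective PHP limits inside the Nextcloud container
sudo docker exec mailu_nc_1 php -i | grep -E 'upload_max_filesize|post_max_size|memory_limit'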
Next, we need to create the configuration for the actual Nextcloud instance. Stop the Docker Compose setup and start it up again. That generates the basic config files on disk, and you can access /data/nc/config/config.php and adjust the following variables (leave the others intact):
'overwritewebroot' => '/nc',
'overwritehost' => 'nc.bayz.de',
'overwriteprotocol' => 'https',
'trusted_domains' =>
array (
  0 => 'lisa.bayz.de',
  1 => 'front',
  2 => 'mailu_front_1.mailu_default',
  3 => 'nc.bayz.de',
),
After another Docker Compose down and up, the instance should be all good! If the admin password needs to be reset, access the container via sudo docker exec -it mailu_nc_1 /bin/bash and reset the password with: su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ user:resetpassword admin"
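Since occ commands come up a lot in the rest of this post, a small shell function on the host can shorten them. This is just a convenience sketch (it assumes the container name mailu_nc_1 and uses docker exec -u to switch to www-data); the rest of the post keeps the explicit form:

# optional helper: run occ inside the Nextcloud container as www-data
occ() {
  sudo docker exec -u www-data mailu_nc_1 php /var/www/html/occ "$@"
}

# example usage
occ config:system:get overwritehost
occ user:resetpassword admin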
Next we can connect Nextcloud to the mailu IMAP server to use it for authentication. First we install the app “External user authentication” from the developer section. Then we add the following code to the above-mentioned config.php:
'user_backends' => array(
    array(
        'class' => 'OC_User_IMAP',
        'arguments' => array(
            'imap', 143, 'null', 'bayz.de', true, false
        ),
    ),
),
Restart the setup, and logging in as a mail user should be possible.
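The app can also be enabled from the command line, and occ offers a quick sanity check. Note that user_external is my assumption for the app id of “External user authentication”, and that IMAP users only show up in the list after their first successful login:

# enable the external user backend app and list the known users
sudo docker exec -it mailu_nc_1 su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ app:enable user_external"
sudo docker exec -it mailu_nc_1 su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ user:list"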
Sync existing files
In my case the instance replaced a previous one, so as part of the migration a lot of “old” data had to be copied over. The problem: copying the data via WebDAV, for example, is time consuming, performs poorly and can be troublesome when the sync has to be resumed after an interruption.
It is easier to sync directly from disk to disk with established tools like rsync. However, Nextcloud does not notice that new files arrived that way and does not list them. The steps to make Nextcloud aware of them are:
- Log in as each user for whom data should be synced so that the target directories exist underneath the files/ directory
- Sync the data with rsync or another tool of choice (see the sketch after this list)
- Correct permissions:
chown -R ...:... files/
- Access container:
sudo docker exec -it mailu_nc_1 /bin/bash
- Trigger file scan in Nextcloud:
su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ files:scan --all"
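As a rough sketch of the sync step itself, with a hypothetical source path and user name (and assuming uid/gid 33 for www-data, as used in the official nextcloud image):

# copy the old files, fix ownership, then let Nextcloud re-scan everything
sudo rsync -avP /backup/old-nextcloud/data/alice/files/ /data/nc/data/alice/files/
sudo chown -R 33:33 /data/nc/data/alice/files/
sudo docker exec -it mailu_nc_1 su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ files:scan --all"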
Recurring updater
For add-ons like the news reader, Nextcloud needs to perform tasks on a regular basis. Surprisingly enough, Nextcloud cannot easily do this on its own. The best way is to add a cron-like job for this, and the best way to do that is a systemd timer.
So first we add the service to be triggered regularly. On the host itself (not inside the container) create the file /etc/systemd/system/nextcloudcron.service:
[Unit]
Description=Nextcloud cron.php job
[Service]
ExecStart=/usr/bin/docker exec mailu_nc_1 su -s /bin/sh www-data -c "/usr/local/bin/php -f /var/www/html/cron.php"
Then, create the timer via the file /etc/systemd/system/nextcloudcron.timer:
[Unit]
Description=Run Nextcloud cron.php every 5 minutes
[Timer]
OnBootSec=5min
OnUnitActiveSec=5min
Unit=nextcloudcron.service
[Install]
WantedBy=timers.target
Enable the timer: systemctl enable --now nextcloudcron.timer. And that is it; this is way more flexible and maintainable than old-style cron jobs. If you are new to timers, check their execution with sudo systemctl list-timers.
DB performance
A lot of Nextcloud’s performance depends on the performance of the DB, and DBs are all about indices. There are a few commands which can help with that, and which are recommended by the self-check inside Nextcloud anyway:
- access container: sudo docker exec -it mailu_nc_1 /bin/bash
- add missing indices: su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ db:add-missing-indices"
- convert filecache: su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ db:convert-filecache-bigint"
- add missing columns: su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ db:add-missing-columns"
Preview generator – fast image loading
The major clients used to access Nextcloud will probably be the Android client and a web browser. However, scrolling through galleries full of images is a pain: it takes ages until all the previews are loaded. Sometimes a slide show is not even possible because it all just takes too long.
This is because the images are not downloaded in full size (that would take too long); instead, previews of the size required at that moment are generated on the fly (which still takes long, but not that long).
To make this all faster, one idea is to pre-generate the previews. To do so, we install the app “Preview Generator” in our instance. However, by default it generates a few too many preview files, many in sizes which are hardly ever used. So we alter the sizes to be generated:
$ sudo docker exec -it mailu_nc_1 /bin/bash
$ su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ config:app:set previewgenerator squareSizes --value='256 1024'"
$ su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ config:app:set previewgenerator widthSizes --value='384 2048'"
$ su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ config:app:set previewgenerator heightSizes --value='256 2048'"
Also we want to limit the preview sizes to not waste too much storage:
$ su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ config:system:set preview_max_x --value 2048"
$ su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ config:system:set preview_max_y --value 2048"
$ su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ config:system:set jpeg_quality --value 80"
$ su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ config:app:set preview jpeg_quality --value='80'"
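If you want to double-check what ended up in the configuration, the values can be read back the same way:

$ su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ config:app:get previewgenerator squareSizes"
$ su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ config:system:get preview_max_x"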
Last but not least we run the preview generator:
su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ preview:generate-all -vvv"
Note that this can easily take hours, so I recommend launching it in a tmux session.
Of course new files will reach the system, so once in a while new previews should be generated. Use this command:
su - www-data -s /bin/bash -c "/usr/local/bin/php -f /var/www/html/occ preview:pre-generate"
This can also be automated with a systemd timer, similar to the one above. Create the file /etc/systemd/system/nextcloudpreview.service:
[Unit]
Description=Nextcloud image preview generator job
[Service]
ExecStart=/usr/bin/docker exec mailu_nc_1 su -s /bin/sh www-data -c "/usr/local/bin/php -f /var/www/html/occ preview:pre-generate"
Then add a timer similar to the one above, triggering the service every 15 minutes. Create the file /etc/systemd/system/nextcloudpreview.timer:
[Unit]
Description=Run Nextcloud preview generator every 15 minutes
[Timer]
OnBootSec=15min
OnUnitActiveSec=15min
Unit=nextcloudpreview.service
[Install]
WantedBy=timers.target
Launch the timer: sudo systemctl enable --now nextcloudpreview.timer
One final word of caution: previews can take up a lot of space. Like, A LOT. Expect maybe an additional 20% of storage on top of your images.
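To keep an eye on how much space the previews actually consume, the preview folder under the data directory can be checked; the appdata directory name contains the instance id, hence the wildcard:

# rough size of all generated previews on disk
sudo du -sh /data/nc/data/appdata_*/preview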
What’s next?
With Nextcloud up and running and all old data synced I was feeling good: all basic infrastructure services were running again. People could access all their stuff with only slight adjustments to their URLs.
The missing piece now was webmail and general groupware functionality. More about that in the next post of this series.
Image by RÜŞTÜ BOZKUŞ from Pixabay