Overcoming ghc "out of memory" errors on OpenBSD

When trying to use Cabal to install Haskell software on OpenBSD I initially ran into build failures. From interrogating the logs I established that a command like:

cabal v2-install hakyll

was failing with a ghc out-of-memory error:

ghc: out of memory (requested 1048576 bytes)

It would seem that I was running up against memory constraints, which was surprising given that my laptop has 16GB of RAM. My first thought was the limits that OpenBSD places on resource allocation in the /etc/login.conf configuration file. By default (that is, for the default login class) on OpenBSD, processes started by users are restricted in the system resources they may consume. OpenBSD provides a mechanism for granting less restrictive limits to users assigned to the staff login class. On this basis I checked that my user was assigned to the staff login class and that its datasize-cur and datasize-max attributes were greater than the amount of memory ghc was requesting. Both were the case:

# Staff have fewer restrictions and can login even when nologins are set.

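For context, the staff class entry in a stock login.conf typically looks something like the following. The exact values vary between OpenBSD releases, so treat these figures as illustrative rather than authoritative:

	staff:\
		:datasize-cur=1536M:\
		:datasize-max=infinity:\
		:maxproc-max=512:\
		:maxproc-cur=256:\
		:ignorenologin:\
		:requirehome@:\
		:tc=default:

The tc=default capability at the end means staff inherits everything else, including the path entry, from the default class.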
So why was ghc still running out of memory when it was launched by a user in the staff login class, for which an ample datasize limit is specified? Perhaps it is because the ghc executable lives in /usr/local/bin, a directory listed in the path entry of the default class block in login.conf. The fix that allowed Hakyll to compile was to raise the datasize-cur and datasize-max values in login.conf for the default login class, giving ghc room to acquire the resources it needs to complete the compilation.

For reference, this is the path entry from the default class block:

	:path=/usr/bin /bin /usr/sbin /sbin /usr/X11R6/bin /usr/local/bin /usr/local/sbin:\
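The change itself is small. In the default class block, the datasize lines end up looking something like this (the 4096M figure is an illustrative choice for my 16GB machine, not a recommendation):

	default:\
		:datasize-cur=4096M:\
		:datasize-max=4096M:\

Two caveats apply after editing login.conf: if a compiled /etc/login.conf.db database exists, it must be regenerated with cap_mkdb /etc/login.conf or the changes will be ignored, and the new limits only take effect at the next login. Running ulimit -d in a fresh shell confirms the active datasize limit (reported in kilobytes).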