Connect to remote database via SSL from UF

I’m trying to set up a separate database server and then connect to it from a separate server running UserFrosting on LEMP. I was able to get it all set up nicely following the DigitalOcean tutorial at https://www.digitalocean.com/community/tutorials/how-to-set-up-a-remote-database-to-optimize-site-performance-with-mysql-on-ubuntu-16-04, but I’m struggling with the last step: getting my UF installation to connect to the new remote database with secure transport enabled (require_secure_transport = on).

I’m using local IP addresses (i.e. within Digital Ocean’s network) and have successfully connected from my web server via mysql on the command line and have verified that this connection is using SSL.
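For anyone following along, this is roughly how I verified that the command-line connection was using SSL (the IP and user here are placeholders for my actual values):

```shell
# From the web server, connect over the private network and ask the server
# which SSL cipher (if any) the current session is using.
mysql -h 10.0.0.2 -u remote_user -p -e "SHOW SESSION STATUS LIKE 'Ssl_cipher';"
```

A non-empty Value column (a cipher name such as ECDHE-RSA-AES128-GCM-SHA256) confirms the session is encrypted; an empty value means the connection is in the clear.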

When I first tried this, I got this error “SQLSTATE[HY000] [3159] Connections using insecure transport are prohibited while --require_secure_transport=ON.”

I couldn’t see anything in the UF documentation about this but was able to deduce the config values that are needed for Laravel to pass to PDO_MYSQL:
'sslmode' => true,
'options' => array(
    PDO::MYSQL_ATTR_SSL_KEY  => 'path to privkey.pem',
    PDO::MYSQL_ATTR_SSL_CERT => 'path to fullchain.pem',
    PDO::MYSQL_ATTR_SSL_CA   => 'path to ca.pem'
)
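For context, here is roughly how I expected those values to sit in a sprinkle's config file. This is a sketch, not a verified working config — the file paths and the use of environment variables are my own assumptions:

```php
<?php
// e.g. app/sprinkles/site/config/default.php (paths are placeholders)
return [
    'db' => [
        'default' => [
            'driver'   => 'mysql',
            'host'     => getenv('DB_HOST'),
            'database' => getenv('DB_NAME'),
            'username' => getenv('DB_USER'),
            'password' => getenv('DB_PASSWORD'),
            // Laravel passes this array straight through to the PDO constructor
            'options'  => [
                PDO::MYSQL_ATTR_SSL_KEY  => '/path/to/client-key.pem',
                PDO::MYSQL_ATTR_SSL_CERT => '/path/to/client-cert.pem',
                PDO::MYSQL_ATTR_SSL_CA   => '/path/to/ca.pem',
            ],
        ],
    ],
];
```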

Here are the things I’ve attempted (without success!) to do to fix this:

  1. I tried referencing the paths to the SSL certificate on my web server. The CA cert that Certbot provides is apparently called chain.pem. This gave me the error ‘PDOException
    Message: failed loading cafile stream: …path’ (the path did exist)

  2. I looked at the file permissions on those files and tried referencing the unsymlinked files directly. I got: PDO::__construct(): SSL operation failed with code 1. OpenSSL Error messages: error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed

  3. I read another article at https://www.digitalocean.com/community/tutorials/how-to-configure-ssl-tls-for-mysql-on-ubuntu-16-04 that made me think (realize?) that the certificate files I need are the ones generated by MySQL on the database server, not the ones on the client server. So I copied these over, but got the same error: ‘PDOException
    Message: failed loading cafile stream: …path’. I then referenced them in my mysqld.cnf file under the [client] directive, but when I restarted I got “ERROR 2026 (HY000): SSL connection error: SSL is required but the server doesn’t support it”. From what I could find, this implies a path error, but the path seemed to be correct.
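For reference, the [client] stanza I was aiming for looks roughly like this (the paths are placeholders — and note that this affects the mysql command-line client, not PHP/PDO, which only reads the options you pass to the PDO constructor):

```ini
[client]
ssl-ca   = /etc/mysql/certs/ca.pem
ssl-cert = /etc/mysql/certs/client-cert.pem
ssl-key  = /etc/mysql/certs/client-key.pem
```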

I’m sure I’m doing something daft here (as you’ll have gathered, I’m somewhat out of my comfort zone), but I wondered if anyone could tell me:

  • is this the right approach overall to doing this in UF or is there another method I’m unaware of?
  • is it the certificate files from the database server that I need to be referencing?
  • what should the permissions on those files be?

Hope this is clear. Thanks for any help!

An update on this:

  • [edited] I was having trouble persuading UF to use the config in my site sprinkle (I could only get it to change its DB connection by overwriting the core sprinkle’s default.php). It turned out I had failed to notice a second ‘db’ entry lower down in the same config file (there to change the charset), which was overwriting the first one.
  • As for the other issue - it doesn’t have anything to do with UF or Laravel - as far as I can tell it has to do with how PHP’s PDO extension works. I’ve written up what I’ve discovered about it below for reference.

Connecting from PHP to MySQL on a remote server via SSL:

  • PHP’s PDO requires that the name on the server’s certificate match the host name used to connect. I’m using an internal IP to connect, so it doesn’t match, and therefore the verification fails.
  • I generated the certificates on the database server using mysql_ssl_rsa_setup
  • mysql_ssl_rsa_setup doesn’t let you set a name for the CA cert (see https://bugs.mysql.com/bug.php?id=84726 and https://dev.mysql.com/worklog/task/?id=7699)
  • When MySQL generated the certificates, it did so using its own CA name, of the form MySQL_Server_<version>_Auto_Generated_[CA|Server|Client]_Certificate
  • I ran this command locally to confirm the CA name: openssl x509 -in ca.pem -noout -text
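To illustrate what that check shows, here is a self-contained version using a throwaway self-signed certificate with the same CN (purely demonstrative — the real ca.pem lives on the database server):

```shell
# Generate a throwaway self-signed cert with a known CN, just to show
# how the CN is embedded in the certificate and how to read it back.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout /tmp/demo-key.pem -out /tmp/demo-ca.pem \
  -subj "/CN=MySQL_Server_5.7.22_Auto_Generated_CA_Certificate"

# Print just the subject line (the full dump is: openssl x509 -in ca.pem -noout -text)
openssl x509 -in /tmp/demo-ca.pem -noout -subject
```

The -subject output contains the CN you need to match against.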
  • I set up a host entry on my client server so that I could connect using a name that matched the CA name precisely (in this case that was MySQL_Server_5.7.22_Auto_Generated_CA_Certificate)
  • But it still didn’t work! The only way I can connect is by using the flag MYSQL_ATTR_SSL_VERIFY_SERVER_CERT and setting it to false. I’m not clear on how much of a security risk this is (but I also don’t know of another way around it!).
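In config terms, the workaround looks like this (a fragment of the options array from earlier; the trade-off noted in the comment is my understanding, not official guidance):

```php
'options' => [
    PDO::MYSQL_ATTR_SSL_CA => '/path/to/ca.pem',
    // Skips verification of the server certificate. The connection is still
    // encrypted, but you lose protection against a man-in-the-middle
    // presenting a different certificate.
    PDO::MYSQL_ATTR_SSL_VERIFY_SERVER_CERT => false,
],
```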

I’ve never tried setting up a remote database server, so on behalf of the community, thanks for looking into this.

I will say though - are you sure you need to remotely access the database directly?

Often, the best solution to a distributed architecture is to expose an HTTP API via an instance of UserFrosting (or some other application) on the same server that your database is installed on. Then other servers can communicate with your database via HTTP, rather than at the database level.

You could use JWT for your access control mechanism.
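To sketch the idea: here is a minimal HS256 JWT issue/verify pair in plain PHP. This is illustrative only — in practice you’d use a maintained library such as firebase/php-jwt, and the claim names and shared secret here are made up:

```php
<?php
// Base64url encoding as required by the JWT spec (RFC 7519).
function base64url(string $data): string {
    return rtrim(strtr(base64_encode($data), '+/', '-_'), '=');
}

// Build a signed token: header.payload.signature, each part base64url-encoded.
function issueToken(array $claims, string $secret): string {
    $header  = base64url(json_encode(['alg' => 'HS256', 'typ' => 'JWT']));
    $payload = base64url(json_encode($claims));
    $sig     = base64url(hash_hmac('sha256', "$header.$payload", $secret, true));
    return "$header.$payload.$sig";
}

// Recompute the signature and compare in constant time.
function verifyToken(string $token, string $secret): bool {
    [$header, $payload, $sig] = explode('.', $token);
    $expected = base64url(hash_hmac('sha256', "$header.$payload", $secret, true));
    return hash_equals($expected, $sig);
}

$token = issueToken(['sub' => 'staging-server', 'exp' => time() + 3600], 'shared-secret');
var_dump(verifyToken($token, 'shared-secret')); // bool(true)
```

The API server would check the token (and its exp claim) on each request before touching the database on the caller’s behalf.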

Alex - thanks for this reply. Am I sure? Not really, but it seemed to make sense to me (and one month in, I haven’t had any issues). My application is almost entirely client-side, with the main server interaction being via database (some reads, but mostly updates being sent via AJAX). It has definitely been handy being able to easily configure my staging site (on its own instance) to talk to the live database. And I think it will help with deployment. But obviously I’ve mostly gone down this road because it feels safe and familiar, not because I know it to be the best one! I’ll definitely take a look at JWT for future reference.