
Enabling SSL in MAMP

  • Open /Applications/MAMP/conf/apache/httpd.conf and add the line
LoadModule ssl_module modules/mod_ssl.so
         or uncomment it if it is already in the file.
  • Also add the following lines to listen on port 80, if they are not there already
Listen 80 
ServerName localhost:80
  • Open /Applications/MAMP/conf/apache/ssl.conf. Remove all <IfDefine SSL> lines as well as the matching </IfDefine> lines. Find the lines defining SSLCertificateFile and SSLCertificateKeyFile and set them to
SSLCertificateFile /Applications/MAMP/conf/apache/ssl/server.crt
SSLCertificateKeyFile /Applications/MAMP/conf/apache/ssl/server.key
  • Create a new folder /Applications/MAMP/conf/apache/ssl. Drop into the terminal and navigate to the new folder
cd /Applications/MAMP/conf/apache/ssl
  • Create a private key, giving a password
openssl genrsa -des3 -out server.key 1024
  • Remove the password
cp server.key server-pw.key
openssl rsa -in server-pw.key -out server.key
  • Create a certificate signing request, pressing return for default values
openssl req -new -key server.key -out server.csr
  • Create a certificate
openssl x509 -req -days 365 -in server.csr -signkey server.key -out server.crt 
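The key, CSR, and certificate steps above can also be scripted non-interactively, which is handy for redoing the setup later. This is a sketch: the passphrase and the -subj fields are placeholder values (not from this guide), and it works in a scratch directory for illustration; on a real MAMP setup run the commands in /Applications/MAMP/conf/apache/ssl instead.

```shell
# Scratch directory for this sketch; use /Applications/MAMP/conf/apache/ssl for real
workdir=$(mktemp -d)
cd "$workdir"

# Private key with a passphrase (supplied via -passout instead of an interactive prompt)
openssl genrsa -des3 -passout pass:changeit -out server-pw.key 1024

# Strip the passphrase so Apache can start without prompting
openssl rsa -passin pass:changeit -in server-pw.key -out server.key

# Certificate signing request with placeholder subject fields; CN must match the hostname
openssl req -new -key server.key -out server.csr \
    -subj "/C=US/ST=CA/L=Local/O=Dev/CN=localhost"

# Self-signed certificate, valid for one year
openssl x509 -req -days 365 -in server.csr -signkey server.key -out server.crt

# Sanity check: print the subject and validity dates of the new certificate
openssl x509 -noout -subject -dates -in server.crt
```

The final command should show CN=localhost and a one-year validity window, confirming the certificate matches what the vhost below expects.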
  • Add the following virtual host definition in extra/httpd-ssl.conf (or wherever the ssl conf file is)
<VirtualHost localhost:443>
DocumentRoot /Users/myname/Documents/DevProjects/WebdevProjects
ServerName localhost
SSLEngine on
SSLCertificateFile /Applications/MAMP/conf/apache/ssl/server.crt
SSLCertificateKeyFile /Applications/MAMP/conf/apache/ssl/server.key
</VirtualHost>
  • Make sure the server is listening on port 443 for SSL, in the SSL config file
    Listen 443
  • Restart your server. If you encounter any problems, check the system log file. The first time you visit https://localhost/ you will be asked to accept the certificate.
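If the server does not come back up, Apache's built-in config check is a quick first step before digging through logs. A sketch, assuming MAMP's bundled apachectl lives at the path below (this location is an assumption, adjust for your install); the -k flag tells curl to accept the self-signed certificate.

```shell
# Syntax-check the Apache config; path assumes MAMP's bundled apachectl
/Applications/MAMP/Library/bin/apachectl configtest

# Once Apache is running, confirm HTTPS answers; -k skips verification of the self-signed cert
curl -k -I https://localhost/
```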
