What an interesting past 24 hours this has been. I've travelled down through the depths of frustration and back up to the summits of triumphant success. And in the process, I've made some tremendous improvements to my site that have significantly improved its scalability and functionality.

As I mentioned yesterday, I've been trying to transfer the hosting of my site's static and media files to Amazon Web Services' S3 service for a number of reasons. Mainly, doing so would reduce my site's loading times, allow users to upload their own images (such as profile pictures) in the future, and avoid the shortcomings of Heroku's ephemeral file system.

Initially, I was trying to follow some documentation that outlined how to connect Docker with both Django and AWS S3. As I had no prior experience with Docker, or remote file hosting in general, I found myself feeling frustrated, stuck, and overwhelmed. It was just too much all at once. After taking a few moments to step back and reflect on the situation, I realized that I was trying to wrangle a massively powerful tool (i.e. Docker) to address a relatively simple problem. The analogy that came to mind was trying to slice bread with a chainsaw. In other words, I needed to think simpler (who knew 'simpler' was grammatically correct?).

The tutorial that finally broke through to me was [this][1] piece by Mani Batra over on Medium. I knew that I needed to somehow integrate my AWS user settings into my settings.py file so that my static and media URLs would point to the bucket I created on S3. After so many hours of frustration and failure, this tutorial finally got me on the right track, and I'd highly recommend it to others struggling to deploy their static files to a remote service.

One of the biggest lessons I took away from this process was how to establish environment variables and read them in my source code. Especially when it comes to web applications, you want to ensure that sensitive information such as secret keys and user IDs doesn't find its way into the code you push to open source sites such as GitHub. One of the easiest ways to secure this information is to define environment variables in your shell's .bash_profile file. Since I'm working on a Mac, this was relatively easy to find and modify, so I can't speak to this process on Windows. At any rate, once you've saved these sensitive pieces of information under variable names, all you need to do is read those variables in the appropriate places in your settings.py file (I've included a rough sketch of what this looks like at the bottom of this post). This is what my code looked like:

![Source code of settings.py][2]

See what I did there?! That's right, I can finally add images to my blog posts now! Thanks to my handy new S3 bucket, I'll have a dependable remote source and will now be able to take advantage of the 'insert image' feature of my markdown editor.

There are even more exciting improvements that I've been making, but I think I'll wait to recap them until tomorrow. This post is already in danger of becoming too long. So instead, please enjoy this image I snapped on my day hike to Brown's Lake during my recent trip to Colorado. It's one of my favorite places in the whole world.
![Picture of mountains rising above Brown's Lake, Colorado][3]

[1]: https://medium.com/@manibatra23/setting-up-amazon-s3-bucket-for-serving-django-static-and-media-files-3e781ab325d5
[2]: https://pickert-website-static.s3.us-east-2.amazonaws.com/blog/10-7-19/Screen+Shot+2019-07-10+at+1.55.09+PM.png
[3]: https://pickert-website-static.s3.us-east-2.amazonaws.com/blog/10-7-19/IMG_7483-min.JPG
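
P.S. For anyone who wants the settings in text form rather than a screenshot, here's a minimal sketch of the general approach. It assumes the django-storages and boto3 packages are installed and that the bucket name and credentials were exported in .bash_profile under the hypothetical variable names shown; it isn't my exact file, and your bucket name, region, and variable names will differ.

```python
# settings.py (excerpt) -- a rough sketch, not the exact file from my site.
# Assumes the values were exported in ~/.bash_profile, for example:
#   export AWS_ACCESS_KEY_ID="..."
#   export AWS_SECRET_ACCESS_KEY="..."
#   export AWS_STORAGE_BUCKET_NAME="..."
import os

INSTALLED_APPS = [
    # ... the usual Django apps ...
    'storages',  # provided by the django-storages package
]

# Read the sensitive values from environment variables so they never
# end up hard-coded in source control.
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']

AWS_S3_CUSTOM_DOMAIN = f'{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com'

# Point the static and media URLs at the S3 bucket.
STATIC_URL = f'https://{AWS_S3_CUSTOM_DOMAIN}/static/'
MEDIA_URL = f'https://{AWS_S3_CUSTOM_DOMAIN}/media/'

# Tell Django to store collected static files and uploads on S3.
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
```

The point is the same lesson described above: settings.py only ever sees the variable names, never the secrets themselves, so the file stays safe to push to GitHub.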