For developers and architects, the last few years have been truly amazing! I thought we had reached a peak when I was able to use complex technologies—including a clustered database, queues, and topics—to build an event-based application over a weekend. Though the logic wasn’t complex, it was amazing to set up that infrastructure with just a right-click. Poof, it’s deployed to the cloud! Then I got the bill. For a sandbox environment, it was more than I wanted—or expected—to pay. Could I have written a Terraform or Ansible script to set it up and tear it down daily? Maybe, but I was just trying to validate a concept. Recently, I had a similar need to design a solution for an event-based system, but this time the solution needed to support both on-premises and cloud deployment.
By now I had deployed several simple apps to local Docker containers, but those were all straightforward apps based on examples from labs and courses I’d taken. Not this time. This time there was no blueprint to follow, because the blueprint was what I needed to create.
Getting started was easy: “docker pull <technology of choice>.” Within minutes, I had MongoDB, RabbitMQ, Kafka, SQL Server, and a few other candidate technologies up and running. My first lesson was to make sure my ports were mapped and exposed to the host.
$> docker run --name ocr-mongo -d -p 27017:27017 ocr.mongo
$> docker run --link ocr-mongo:mongo -p 8081:8081 mongo-express
Note: You’ll soon need to update mongo-express to use environment variables instead of --link, which has been deprecated.
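A link-free setup can be expressed with a user-defined network and environment variables instead. Here’s a minimal Compose sketch; the service names are my own placeholders, and ME_CONFIG_MONGODB_SERVER is the variable the mongo-express image documents for pointing at a Mongo host:

```yaml
# docker-compose.yml — hedged sketch of the same two containers without --link.
# Compose puts both services on a shared network, so mongo-express can reach
# the database by its service name.
version: "3"
services:
  ocr-mongo:
    image: mongo
    ports:
      - "27017:27017"
  mongo-express:
    image: mongo-express
    environment:
      ME_CONFIG_MONGODB_SERVER: ocr-mongo   # resolve Mongo by service name
    ports:
      - "8081:8081"
    depends_on:
      - ocr-mongo
```

With `docker-compose up`, the admin UI is reachable on port 8081 just as in the --link version.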
By exposing ports externally, I was able to run code outside the container—from the command line or from my IDE of choice—and connect to the product running inside it (SQL Server, RabbitMQ, etc.). In minutes I had a few .NET Core console apps publishing events and a Node script listening for queued messages and storing them in MongoDB. I tested Kafka against RabbitMQ to see which I could connect to, and to determine which one better met my messaging needs. As a bonus, I didn’t even need to be connected to the internet. I could work from anywhere!
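As a sketch of that wiring, a Node listener along these lines consumes from RabbitMQ and writes each message into Mongo. The queue, database, and collection names here are placeholders rather than the ones from my project, and it assumes the amqplib and mongodb npm packages plus the exposed ports from the containers above:

```javascript
// listener.js — hedged sketch: consume events from RabbitMQ, store them in MongoDB.
// Assumes RabbitMQ on localhost:5672 and Mongo on localhost:27017; the
// 'ocr-events' queue and 'ocr' database names are illustrative placeholders.
const amqp = require('amqplib');
const { MongoClient } = require('mongodb');

async function main() {
  const mongo = new MongoClient('mongodb://localhost:27017');
  await mongo.connect();
  const events = mongo.db('ocr').collection('events');

  const conn = await amqp.connect('amqp://localhost');
  const channel = await conn.createChannel();
  await channel.assertQueue('ocr-events', { durable: true });

  channel.consume('ocr-events', async (msg) => {
    if (msg === null) return; // consumer was cancelled
    // Persist the raw message body, then acknowledge it off the queue.
    await events.insertOne({ body: msg.content.toString(), receivedAt: new Date() });
    channel.ack(msg);
  });
}

main().catch(console.error);
```

Because the containers publish their ports to the host, this script runs unchanged from a terminal or an IDE debugger.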
As the API and functionality solidified, I began moving my code from running in the IDE to running in containers. I was surprised at how easy it was to modify existing base Dockerfiles to get my Node and .NET apps running in containers. The most complicated part of the process was compiling the .NET app, then the copy steps required to move the published assets into another container image for deployment (see the simple example below).
# Stage 1
FROM microsoft/aspnetcore-build AS builder
# restore & publish …

# Stage 2
FROM microsoft/aspnetcore
WORKDIR /app
COPY --from=builder /app .
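Filled in, the two stages look roughly like this. The project name (OcrApp) and paths are placeholders, and the microsoft/aspnetcore images reflect the tags of that era (newer projects would use mcr.microsoft.com/dotnet/sdk and mcr.microsoft.com/dotnet/aspnet):

```dockerfile
# Stage 1: restore, build, and publish inside the SDK image.
FROM microsoft/aspnetcore-build AS builder
WORKDIR /src
COPY OcrApp.csproj ./
RUN dotnet restore
COPY . .
RUN dotnet publish -c Release -o /app

# Stage 2: copy only the published output into the lean runtime image.
FROM microsoft/aspnetcore
WORKDIR /app
COPY --from=builder /app .
ENTRYPOINT ["dotnet", "OcrApp.dll"]
```

The payoff of the two-stage pattern is that the compilers and SDK never ship in the final image, only the published assets.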
While I realize this isn’t a traditional use of Docker, it’s the one that worked for me. More importantly, it showed me the value and power of containers. Now that the application is working locally in my personal cloud, I’ll deploy it to the public cloud and test scaling it using options like OpenShift and Azure Container Service (AKS).
So stay tuned for those results! And to learn more about deploying or automating containers on-premise or in the cloud, please check out our Cloud & Hybrid IT offerings, or contact me directly at [email protected].
Managing Director, Presales