Bret Fisher is a widely respected Docker authority, instructor, and system administrator with deep knowledge of both the Docker platform and the story behind it. We dug in to learn more.
The March episode of the localhost podcast, hosted by Mark Drew and Rob Dudley and covering hot topics in web development, will feature Bret as a special guest. They'll talk about the current state of containerization, as well as where it's headed from here.
For 25 years, Bret has built and operated distributed systems as a sysadmin and has helped over 30,000 people learn dev and ops topics. He is the author of the wildly popular Docker Mastery course on Udemy, and he also provides DevOps-style consulting and live workshops with a focus on immutable infrastructure, containers, and orchestration.
As part of getting to know Bret a little better, and in turn making the introduction to the Mura community, we had the opportunity to throw him some questions, and receive his thoughtful replies. The conversation follows:
Hi Bret! Easy first question: surfing or Docker, which one are you better at?
Docker, for sure! After living near the ocean for 15 years, I've just recently started to learn surfing, and I'm still on my "softee" board but it's loads of fun, even on the small waves.
Say you're at a conference, in an elevator filled with developers who still craft their environments by hand … what would your 10-second Docker pitch look like?
Building your apps in Docker and running everything as a container means your environments are more reproducible and replaceable. When you're fully using the Docker toolset (or its third-party equivalents) everything from running local dev environments to updating apps in production gets easier. The bonus is that it's usually easier to learn and implement containers than traditional automation and deployment tools, especially with my popular Docker Mastery and Swarm Mastery courses.
(Editor's note: use promo code "MURACON18" for a very special discount on the Docker course!)
OK, the elevator's on its way back down now. These guys are all doing virtualization, but they're not sure why they should learn yet another way of doing things. What's the Docker elevator pitch for them?
Bret's First Law Of Virtualization: However many servers you have today, tomorrow you'll have those plus one. The one-two punch is that we're also being asked to manage more with less.
That "VM sprawl" is largely because we've learned over the last 15 years to make the virtual machine our isolation layer for each app. To keep things isolated and manageable, we put the web front-end on one server, the API or cron jobs on another, and the database on its own. We often had to create complex tooling with Puppet, Ansible, etc. to manage all those systems and environments.
With Docker, if your apps are "Dockerized" into their own container images, the running container becomes that isolation layer rather than the VM, and many containers can run on a single virtual machine. You get much higher utilization out of the servers you're already paying for. Add bonuses like easier VM replacement, easier upgrades, easier rollbacks, and easier testing, and the list of benefits gets so long it's hard to argue against.
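(Editor's note: for readers new to "Dockerizing," here's a minimal sketch of what it looks like in practice. The Node.js base image, port, and file names are illustrative, not taken from Bret's projects:)

```dockerfile
# Dockerfile: bake a web app and its dependencies into one image
FROM node:8-alpine          # small base image that provides the runtime
WORKDIR /app
COPY package.json .
RUN npm install             # dependencies live inside the image, not the host
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]   # the running container is the isolation layer
```

Build it once with `docker build -t myapp .`, and the same image runs on a laptop, a test server, or production.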
The indentation hill you would die upon: tabs or spaces?
Spaces. I have my tab key auto-convert to spaces, I'm not a savage :P
In your opinion, what's the biggest misconception of what Docker actually is?
Docker is no longer just a "container runtime." It's now a full platform of tools that encompass the "build, ship, run" lifecycle of server apps. Because of that, it's as much a game-changer for operations and sysadmins as it is for developers.
A lot of the time the anti-new-thing argument I hear around technologies like Docker is, "I'm just a lone developer/small shop, I don't need something like that." Is there a place for Docker in this segment?
Sure, if not now, later. Containers aren't a fad; they're the next evolution of how we install, deploy, and update software, which has been continually improving since the days of "un-tar from magnetic tape." There will be a day soon when it becomes the default to provide your software as a container image. Think of Docker Hub (and others) as the new package manager that's platform- and language-agnostic. You learn these tools once, and every app and database on Linux, Windows, Raspberry Pi, and even mainframes uses the same techniques and commands.
Soon, expecting people to hand-install dependencies and set up environment variables just to use your product will seem old-fashioned and laborious. This concept shift already happened in mobile apps, and it's increasingly true with desktop apps, where the user now adds or removes your product in a single action. It suddenly seems silly to think they'd have to do a manual install, copy-paste files, etc. Containers are now leading that shift for servers, where sysadmins are your product's "users." Once they are deploying containers with a single action and little effort, the old way will seem quite painful and unnecessary.
I see this shift happening everywhere. It's not just with distributed web apps in the cloud. It's impacting little things that were previously too complex for their users to configure. A fun project for me this year was helping "Dockerize" a medical modeling and simulation solution that traditionally required thousands of dollars in setup consulting per environment. In our first beta release of the open source container images, you can run the full suite on your laptop or a cloud server with a single Docker command, letting scientists do modeling without any IT personnel involved. Previously they'd often have to get funding just for the expertise to get it working. That's the power of containers!!
The novel you tell everybody you're reading, and the novel you are *actually* reading?
Hah, well, those are the same currently, as I'm re-reading *Ready Player One* before the Spielberg-directed movie releases in 2018. If you like anything about sci-fi and/or '80s pop culture, it's a must-read.
(Editor's note: agreed!)
As somebody who gets to peek behind the curtain, what features are you most excited about in Docker's future?
Docker Engine (the core service that runs your containers) is quite stable at this point, and most of the changes that affect you and me are in higher level tooling around container orchestration, security, and making tools easier to use.
2018 will be the year we see the Docker and Kubernetes communities work even closer together to make multi-server orchestration easier to deploy and manage. The promise of container orchestration is huge, and there's a reason it's all the cloud tech news is talking about.
Imagine a single complex solution with a dozen different web applications, databases, workers, and APIs, deployed with just a single YAML config file and a few commands (or clicks). And do it on any cloud, in a highly available, load-balanced way. That's container orchestration.
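(Editor's note: to make that concrete, here's a hedged sketch of such a YAML file for Docker Swarm; the service names and images are illustrative:)

```yaml
# stack.yml -- deploy with: docker stack deploy -c stack.yml myapp
version: "3.3"
services:
  web:
    image: myorg/web:1.0        # illustrative image name
    ports:
      - "80:3000"
    deploy:
      replicas: 3               # Swarm load-balances across replicas
  api:
    image: myorg/api:1.0
    deploy:
      replicas: 2
  db:
    image: postgres:10
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data:
```

One file describes the whole solution; the orchestrator keeps the declared number of replicas running and routes traffic to healthy ones.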
Docker Swarm (my favorite orchestrator) is quite easy to get started with, and I hope we'll see some advancements in built-in persistent data storage and more advanced load-balancer control. But those are just wishes of mine. And hey, this is open source, so keep an eye on the pull requests of moby/moby on GitHub and the other toolkit pieces listed in the Moby Project if you want to read the tea leaves.
Overall, I think 2018 will be a year for containers to stabilize the lower-level pieces on open standards and start to settle around common "stacks" of tools that build a full solution for the "build, ship, run" lifecycle of server apps.