I like containers—I really do. They provide a built-in architecture for, and reward, a distributed-systems approach to development, and they offer portability. Indeed, a good deal of my work involves designing and deploying containers for both existing and new applications.

But there can be too much of a good thing, and I see signs that containers are used where they don’t fit. It’s common for technologists to chase the hype cycle, and containers are the current flavor of the month. They’re sometimes adopted because they’re trendy, not because they’re the right technology.

As a result, IT is spending too much money and time trying to force-fit an application’s square peg into a container’s round hole.

If there’s a silver lining, it’s that containerizing applications that don’t need to be in containers won’t cause them to fail. You’re not risking your operations, only wasting your resources. Given that containers are typically used for cloud deployments, such waste is ironic: Cost savings are usually the biggest reason to move to the cloud in the first place. Unnecessary containerization eats into those very savings.