Squid: Configuring an Ubuntu host to use a Squid proxy for internet access

Once you have a Squid proxy set up as described in my article here, the next challenge is configuring your Ubuntu servers so that they use this proxy by default instead of attempting direct internet connections.

There are several clients we want to use Squid by default: the apt package manager, interactive shells (including wget and curl), and Java applications.
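
As a minimal sketch of those three pieces, the commands below write the relevant settings; the address 192.168.1.20 and port 3128 are placeholders for your actual Squid host and port.

# Placeholder proxy address -- replace with your Squid host and port
PROXY_HOST=192.168.1.20
PROXY_PORT=3128

# 1. apt: drop a proxy fragment into apt.conf.d
sudo tee /etc/apt/apt.conf.d/80squid-proxy <<EOF
Acquire::http::Proxy  "http://${PROXY_HOST}:${PROXY_PORT}";
Acquire::https::Proxy "http://${PROXY_HOST}:${PROXY_PORT}";
EOF

# 2. interactive shells, wget, and curl: export the standard proxy variables
sudo tee /etc/profile.d/squid-proxy.sh <<EOF
export http_proxy="http://${PROXY_HOST}:${PROXY_PORT}"
export https_proxy="http://${PROXY_HOST}:${PROXY_PORT}"
export no_proxy="localhost,127.0.0.1"
EOF

# 3. Java applications: pass the JVM proxy system properties,
#    typically via the service's JAVA_OPTS
JAVA_OPTS="-Dhttp.proxyHost=${PROXY_HOST} -Dhttp.proxyPort=${PROXY_PORT} -Dhttps.proxyHost=${PROXY_HOST} -Dhttps.proxyPort=${PROXY_PORT}"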

Continue reading “Squid: Configuring an Ubuntu host to use a Squid proxy for internet access”

Ubuntu: Silent package installation and debconf

If you have worked on deploying packages via apt-get, you are probably familiar with a couple of forms of interruption during the package installation and upgrade process.

The first is the text menu shown during package upgrades that informs you that a new configuration file is available and asks if you want to keep your current one, use the new one from the package maintainer, or show the difference.

The second is the occasional ASCII dialog that interrupts the install/upgrade and asks for essential information before moving forward.  The screenshot below is the dialog you get when installing MySQL or MariaDB, asking you to set the initial root password for the database.

The problem, in this age of cloud scale, is that you often need completely silent installations and upgrades that can be pushed out via configuration management.  Even if you are building an immutable image, you would rather have a fully automated build process than manual intervention each time you construct the image.
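
As a sketch of what that automation can look like, the snippet below preseeds the MySQL root password prompt with debconf-set-selections and then runs apt-get non-interactively with the dpkg options that keep your existing configuration files. The password value is a placeholder, and the debconf keys shown are the ones historically used by the mysql-server package; other packages use their own keys.

# Placeholder root password value, for illustration only
DB_ROOT_PW='ChangeMe123!'

# 1. Pre-answer the questions the installer would normally ask in a dialog
echo "mysql-server mysql-server/root_password password ${DB_ROOT_PW}" | sudo debconf-set-selections
echo "mysql-server mysql-server/root_password_again password ${DB_ROOT_PW}" | sudo debconf-set-selections

# 2. Install and upgrade without interruption:
#    - DEBIAN_FRONTEND=noninteractive suppresses debconf dialogs
#    - --force-confdef / --force-confold keep your current config files
#      instead of showing the "configuration file changed" menu
sudo DEBIAN_FRONTEND=noninteractive apt-get install -y mysql-server
sudo DEBIAN_FRONTEND=noninteractive apt-get -y \
  -o Dpkg::Options::="--force-confdef" \
  -o Dpkg::Options::="--force-confold" \
  dist-upgrade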

Continue reading “Ubuntu: Silent package installation and debconf”

Ubuntu: Installing Packages without Public Internet Access

In production data centers, it is not uncommon to have limited public internet access due to security policies.  So while running ‘apt-get’ or adding a repository to sources.list is easy in your development lab, you need an alternative installation strategy for production, ideally one that looks the same across both environments.

For some, building containers or images will satisfy this requirement.  The container/image can be built once in development, and transferred as an immutable entity to production.

But for those who use automated configuration management such as Salt/Chef/Ansible/Puppet to layer components on top of a base image inside a restricted environment, there is a need to get binary packages onto these guest OSes without requiring public internet access.

There are several approaches you could take, such as an offline repository or a tool like Synaptic, Keryx, or apt-mirror.  In this post I’ll go over using apt-get on an internet-connected source machine to download the necessary packages for Apache2, and then running dpkg on the non-connected target machine to install each required .deb package and bring up a running instance of Apache2.
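
For reference, here is one common recipe along those lines, assuming the source and target machines run the same Ubuntu release and architecture.

# On the internet-connected source machine: download Apache2 plus its
# recursive dependency set into a local directory
mkdir -p ~/apache2-debs && cd ~/apache2-debs
apt-get download apache2 $(apt-cache depends --recurse --no-recommends --no-suggests \
  --no-conflicts --no-breaks --no-replaces --no-enhances apache2 \
  | grep '^\w' | sort -u)

# Copy the directory to the non-connected target machine (scp, USB, etc.),
# then install every .deb together so dpkg can satisfy dependencies from
# the set (run it a second time if it complains about ordering)
sudo dpkg -i ~/apache2-debs/*.deb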

Note that this solution only addresses apt packages.  If you need to pull down JavaScript packages from npm or Python modules from PyPI, then you might want to look at my article on using a Squid proxy to whitelist specific URLs.

Continue reading “Ubuntu: Installing Packages without Public Internet Access”