Zabbix: Installing a Zabbix Agent on Ubuntu 14.04

The open-source Zabbix monitoring solution has very lightweight agents that are easy to install on Ubuntu.

Although the Ubuntu main repository has a build available, it is older, so in this article we will download and install the latest point release.  Unfortunately, repo.zabbix.com cannot be added directly as an Ubuntu repository source.
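
A rough sketch of that approach (the package URL, point version, and server hostname below are placeholders; browse repo.zabbix.com for the current release):

$ wget http://repo.zabbix.com/zabbix/2.4/ubuntu/pool/main/z/zabbix/zabbix-agent_2.4.8-1+trusty_amd64.deb
$ sudo dpkg -i zabbix-agent_2.4.8-1+trusty_amd64.deb
$ sudo sed -i 's/^Server=.*/Server=my-zabbix-server/' /etc/zabbix/zabbix_agentd.conf
$ sudo service zabbix-agent restart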

Continue reading “Zabbix: Installing a Zabbix Agent on Ubuntu 14.04”

ELK: ElasticDump and Python to create a data warehouse job

By nature, the amount of data collected in your ElasticSearch instance will continue to grow and at some point you will need to prune or warehouse indexes so that your active collections are prioritized.

ElasticDump can assist in moving your indexes either to a distinct ElasticSearch instance that is set up specifically for long-term data, or exporting the data as JSON for later import into a warehouse like Hadoop.  ElasticDump does not have a special filter for time-based indexes (index-YYYY.MM.DD), so you must specify exact index names.

In this article we will use Python to query a source ElasticSearch instance (an instance meant for near real-time querying that keeps a minimal amount of data), and export any indexes from the last 14 days into a target ElasticSearch instance (an instance meant for data warehousing, which has more persistent storage and where users expect multi-second query times).
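
The Python in the full post drives the date math and orchestration, but the underlying elasticdump call for a single day’s index looks roughly like this (hostnames and the index name are hypothetical):

$ # copy both the mapping and the documents for one daily index
$ elasticdump --input=http://source-es:9200/index-2017.01.15 --output=http://warehouse-es:9200/index-2017.01.15 --type=mapping
$ elasticdump --input=http://source-es:9200/index-2017.01.15 --output=http://warehouse-es:9200/index-2017.01.15 --type=data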

Continue reading “ELK: ElasticDump and Python to create a data warehouse job”

ELK: Using Curator to manage the size and persistence of your index storage

The Curator product from ElasticSearch allows you to apply batch actions to your indexes (close, create, delete, etc.).  One specific use case is applying a retention policy to your indexes, deleting any indexes that are older than a certain threshold.

Installation

Start by installing Curator using apt and pip:

$ sudo apt-get install python-pip -y

$ sudo pip install elasticsearch-curator

$ /usr/local/bin/curator --version
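
Once installed, applying a retention policy is a single command; a rough sketch assuming the older Curator 3.x command-line syntax and daily indexes with a %Y.%m.%d timestring (Curator 4+ moved to YAML action files):

$ curator --host localhost delete indices --older-than 30 --time-unit days --timestring '%Y.%m.%d'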

Continue reading “ELK: Using Curator to manage the size and persistence of your index storage”

VirtualBox: Installing VirtualBox and Vagrant on Ubuntu 14.04/16.04

Although container-based engines such as Docker are highly popular for newer application deployments, OS virtualization engines will still see widespread use for years to come.

One of the most popular virtualization engines for development purposes is the open-source VirtualBox from Oracle.  This article will detail its installation, along with Vagrant, on Ubuntu 14.04/16.04.
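
As a quick sketch, both packages are available straight from the Ubuntu repositories (albeit at older versions than the upstream downloads the full article walks through):

$ sudo apt-get update
$ sudo apt-get install virtualbox -y
$ sudo apt-get install vagrant -y
$ vagrant --version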

Continue reading “VirtualBox: Installing VirtualBox and Vagrant on Ubuntu 14.04/16.04”

Docker: Sending Spring Boot logging to syslog

Building services using Spring Boot gives a development team a jump start on many production concerns, including logging.  But unlike a standard deployment, where the developer’s responsibility typically ends at logging to a local file, with Docker we must think about how to send logs to a location outside our ephemeral container filesystem.

The Docker logging drivers capture all the output from a container’s stdout/stderr, and can send a container’s logs directly to most major logging solutions (syslog, Logstash, gelf, fluentd).

As an added benefit, by making the logging implementation a runtime choice for the container, it provides flexibility to use a simpler implementation during development but a highly-available, scalable logging solution in production.
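
For example, the logging driver is selected with run-time flags on the container; a minimal sketch (the image name, tag value, and syslog host are hypothetical):

$ docker run -d --log-driver=syslog \
    --log-opt syslog-address=udp://logs.example.com:514 \
    --log-opt tag=spring-boot-app \
    my-spring-boot-app:latest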

Continue reading “Docker: Sending Spring Boot logging to syslog”

Spring: Spring Boot with SLF4J/Logback sending to syslog

The Spring framework provides a proven and well-documented model for the development of custom projects and services. The Spring Boot project takes an opinionated view of building production Spring applications, which favors convention over configuration.

In this article we will explore how to configure a Spring Boot project to use the Simple Logging Facade for Java (SLF4J) with a Logback backend to send log events to the console, filesystem, and syslog.

Continue reading “Spring: Spring Boot with SLF4J/Logback sending to syslog”

Docker: Installing Docker CE on Ubuntu 14.04

Docker is a container platform that streamlines software delivery and provides isolation, scalability, and efficiency with less overhead than OS level virtualization.

These instructions are taken directly from the official Docker for Ubuntu page, but I wanted to reiterate the tasks essential for installing the Docker Community Edition on Ubuntu 14.04.
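
The essential tasks boil down to trusting Docker’s GPG key, adding the apt repository, and installing the package; roughly:

$ sudo apt-get install apt-transport-https ca-certificates curl software-properties-common -y
$ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
$ sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
$ sudo apt-get update && sudo apt-get install docker-ce -y
$ sudo docker run hello-world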

Continue reading “Docker: Installing Docker CE on Ubuntu 14.04”

Squid: Configuring an Ubuntu host to use a Squid proxy for internet access

Once you have a Squid proxy setup as described in my article here, the next challenge is configuring your Ubuntu servers so that they use this proxy by default instead of attempting direct internet connections.

There are several entities we want to use Squid by default: the apt package manager, interactive consoles and wget/curl, and Java applications.
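
A rough sketch for the first two of those (squidhost and port 3128 are hypothetical; Java applications typically need the equivalent -Dhttp.proxyHost/-Dhttp.proxyPort system properties):

$ # apt package manager
$ echo 'Acquire::http::Proxy "http://squidhost:3128";' | sudo tee /etc/apt/apt.conf.d/95proxy
$ # interactive consoles, wget, and curl
$ echo 'export http_proxy=http://squidhost:3128'  | sudo tee -a /etc/profile.d/proxy.sh
$ echo 'export https_proxy=http://squidhost:3128' | sudo tee -a /etc/profile.d/proxy.sh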

Continue reading “Squid: Configuring an Ubuntu host to use a Squid proxy for internet access”

Squid: Controlling network access using Squid and whitelisted domains

Having your production servers go through a proxy like Squid for internet access can be an architectural best practice that provides network security as well as caching efficiencies.

For further security, denying access to all requests but an explicit whitelist of domains provides auditable control.
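
A rough sketch of that policy (the domain is only an example, and on Ubuntu 14.04 the package, service, and paths are named squid3 rather than squid):

$ echo ".ubuntu.com" | sudo tee /etc/squid/whitelist.txt
$ # add these directives to /etc/squid/squid.conf, above the final "http_access deny all" rule:
$ #   acl whitelist dstdomain "/etc/squid/whitelist.txt"
$ #   http_access allow whitelist
$ sudo service squid restart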

Continue reading “Squid: Controlling network access using Squid and whitelisted domains”

HAProxy: Using HAProxy for SSL termination on Ubuntu

HAProxy is a high performance TCP/HTTP (layer 4 and layer 7) load balancer and reverse proxy.  A common pattern is allowing HAProxy to be the fronting SSL-termination point, and then HAProxy determines which pooled backend server serves the request.
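
A minimal sketch of that pattern in haproxy.cfg, assuming HAProxy 1.5+ with SSL support (the combined certificate path and backend addresses are hypothetical):

$ cat <<'EOF' | sudo tee -a /etc/haproxy/haproxy.cfg
frontend https-in
    bind *:443 ssl crt /etc/ssl/private/mysite.pem
    default_backend web-pool

backend web-pool
    balance roundrobin
    server web1 10.0.0.11:8080 check
    server web2 10.0.0.12:8080 check
EOF
$ sudo haproxy -c -f /etc/haproxy/haproxy.cfg && sudo service haproxy restart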

Continue reading “HAProxy: Using HAProxy for SSL termination on Ubuntu”

Nginx: Using Nginx for SSL termination on Ubuntu

Nginx is a popular reverse proxy and load balancer that focuses on layer 7 (application) traffic.  A common pattern is allowing Nginx to be the fronting SSL-termination point, and then Nginx determines which pooled backend server is best suited to serve the request.
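
A minimal sketch of that pattern as a conf.d drop-in (certificate paths, server name, and upstream addresses are hypothetical):

$ cat <<'EOF' | sudo tee /etc/nginx/conf.d/ssl-terminate.conf
upstream web_pool {
    server 10.0.0.11:8080;
    server 10.0.0.12:8080;
}
server {
    listen 443 ssl;
    server_name mysite.example.com;
    ssl_certificate     /etc/ssl/certs/mysite.crt;
    ssl_certificate_key /etc/ssl/private/mysite.key;
    location / {
        proxy_pass http://web_pool;
        proxy_set_header Host $host;
    }
}
EOF
$ sudo nginx -t && sudo service nginx reload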

Continue reading “Nginx: Using Nginx for SSL termination on Ubuntu”

Apache2: Enable LDAP authentication and SSL termination for Ubuntu

Some web applications treat authentication as an orthogonal concern to the application itself, not including any kind of login functionality and instead leaving authentication to operations.

When this happens, a reverse proxy that has an LDAP integration can act as an architectural sentry in front of the web application and can also fulfill the requirements for Single Sign-On.  Apache2 serves this purpose very well with minimal overhead.
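
A rough sketch of the LDAP piece using mod_authnz_ldap (the LDAP URL and base DN are hypothetical):

$ sudo a2enmod ldap authnz_ldap
$ cat <<'EOF' | sudo tee /etc/apache2/conf-available/ldap-auth.conf
<Location />
    AuthType Basic
    AuthName "Restricted"
    AuthBasicProvider ldap
    AuthLDAPURL "ldap://ldap.example.com/dc=example,dc=com?uid"
    Require valid-user
</Location>
EOF
$ sudo a2enconf ldap-auth
$ sudo service apache2 restart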

Continue reading “Apache2: Enable LDAP authentication and SSL termination for Ubuntu”

Ubuntu: Creating a self-signed certificate using OpenSSL on Ubuntu

There are numerous articles I’ve written where a self-signed certificate is a prerequisite for deploying a piece of infrastructure.

Here are the quick steps for creating a self-signed certificate on an Ubuntu server.  First we create the destination directory and make sure we have the SSL packages.
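
A rough sketch of those steps (the key/cert filenames and certificate subject below are just examples):

$ sudo apt-get install openssl ca-certificates -y
$ sudo mkdir -p /etc/ssl/private
$ sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
    -keyout /etc/ssl/private/mysite.key \
    -out /etc/ssl/certs/mysite.crt \
    -subj "/C=US/ST=CA/L=MyCity/O=MyOrg/CN=mysite.example.com"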

Continue reading “Ubuntu: Creating a self-signed certificate using OpenSSL on Ubuntu”

Jenkins: Setting up a continuous integration server on Ubuntu

Jenkins is the open-source automation server that is critical in building a continuous integration and delivery pipeline.  It is extensible and has a wealth of plugins that integrate with numerous enterprise systems.

Here are the detailed steps for installing a Jenkins server on Ubuntu.
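
In short, the steps boil down to adding the Jenkins LTS apt repository and installing the package; a rough sketch:

$ wget -q -O - https://pkg.jenkins.io/debian-stable/jenkins.io.key | sudo apt-key add -
$ echo "deb https://pkg.jenkins.io/debian-stable binary/" | sudo tee /etc/apt/sources.list.d/jenkins.list
$ sudo apt-get update && sudo apt-get install jenkins -y
$ sudo cat /var/lib/jenkins/secrets/initialAdminPassword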

Continue reading “Jenkins: Setting up a continuous integration server on Ubuntu”

Maven: Installing a 3rd party jar to a local or remote repository

Especially in enterprise application development, there can be 3rd party dependencies that are not available in public Maven repositories.  These may be internal, business specific libraries or licensed libraries that have limitations on usage.

When this is the case, you can either publish them to a private Maven repository that controls authorization, or install them into your local Maven repository cache.
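
As a sketch, both options are handled with standard Maven goals (the jar name, coordinates, repository URL, and repositoryId below are hypothetical):

$ # install into the local repository cache (~/.m2/repository)
$ mvn install:install-file -Dfile=acme-lib-1.2.jar -DgroupId=com.acme -DartifactId=acme-lib -Dversion=1.2 -Dpackaging=jar

$ # or publish to a private remote repository defined in settings.xml
$ mvn deploy:deploy-file -Dfile=acme-lib-1.2.jar -DgroupId=com.acme -DartifactId=acme-lib -Dversion=1.2 -Dpackaging=jar -Durl=https://repo.example.com/artifactory/libs-release-local -DrepositoryId=private-repo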

Continue reading “Maven: Installing a 3rd party jar to a local or remote repository”

Maven: Installing a private Maven repository on Ubuntu using Artifactory

An essential part of the standard build process for Java applications is having a set of repositories where project artifacts are stored.

Artifact curation provides the ability to manage dependencies, quickly roll back releases, support compatibility of downstream projects, do QA promotion from test to production, support a continuous build pipeline, and provide auditability.

JFrog puts out an open-source repository manager called Artifactory that is perfect for setting up a private Maven repository for internal applications.

Continue reading “Maven: Installing a private Maven repository on Ubuntu using Artifactory”

Monitoring: Java JMX exploration from the console using jmxterm

Java JMX (Java Management Extensions) is a standardized way of monitoring Java based applications.  The managed resources (MBeans) are defined and exposed by the JVM, application server, and application – and offer a view into these layers that can provide invaluable monitoring data.

But in order to report back the JMX data you must know the fully expanded path of the MBean and its available attributes/operations.  If you are on a desktop, tools like jconsole provide a nice GUI for drilling down into the MBean hierarchy.  But if you are in a server environment and JMX is not enabled for remote access, you may need a console alternative.

An open-source project called jmxterm comes packaged as a single uber jar that makes it easy to enumerate and explore the MBeans exposed in a Java based application.
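
A quick sketch of an interactive session (the jar version and the pid 1234 are hypothetical):

$ java -jar jmxterm-1.0.0-uber.jar
$> jvms
$> open 1234
$> domains
$> domain java.lang
$> beans
$> get -b java.lang:type=Memory HeapMemoryUsage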

Continue reading “Monitoring: Java JMX exploration from the console using jmxterm”

Ubuntu: Using strace to get a view into file and network activity of a process

strace is a handy utility for tracing the system, file, and network calls made on a Linux system.  It can produce trace output either for an already running process or for a new process that it launches.

Some of the most common troubleshooting scenarios involve isolating either the network or the file system activity of a process.  For example, determining whether an application was attempting to reach out to a server on the expected port, or understanding why a startup configuration file was not being read from the expected directory.
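
For example, something like the following isolates each case (the pid and program name are hypothetical):

$ # network calls of an already running process
$ sudo strace -f -p 1234 -e trace=network
$ # file access during startup of a new process
$ strace -f -e trace=open,stat ./myapp 2>&1 | grep conf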

Continue reading “Ubuntu: Using strace to get a view into file and network activity of a process”

Ubuntu: Using tcpdump for analysis of network traffic and port usage

tcpdump comes standard on Ubuntu servers and is an invaluable tool in determining traffic coming in and out of a host.

As network infrastructures have become more complex and security conscious, validating network flow from client hosts through potentially multiple proxies and ultimately to a destination host and port has become more important than ever.

Let me list a few of the more common use cases.
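
For instance, a few representative invocations (interface names, hosts, and ports are only examples):

$ # all traffic to or from a specific host
$ sudo tcpdump -i eth0 -nn host 10.0.0.5
$ # verify requests are actually arriving on a given port
$ sudo tcpdump -i any -nn port 8080
$ # capture to a file for later analysis in Wireshark
$ sudo tcpdump -i eth0 -nn -w /tmp/capture.pcap port 443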

Continue reading “Ubuntu: Using tcpdump for analysis of network traffic and port usage”

Nginx: Custom access log format and error levels

Nginx is a powerful application level proxy server.  Whether for troubleshooting or analysis, enabling log levels and custom formats for the access/error logs is a common requirement.

Error Logs

By default, only messages in the error category are logged.  If you want to enable more detail, then modify nginx.conf like so:

error_log file [level]

Enabling debug level on Linux would usually look like:

error_log /var/log/nginx/error.log debug;

Access Logs

Access logs and their format are also customized in nginx.conf.  By default, if no format is specified then the combined format is used.

access_log file [format]
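
As a sketch, a custom format is declared once and then referenced by name; here it is dropped into conf.d, which Ubuntu’s default nginx.conf includes at the http level (the chosen fields are just an example):

$ cat <<'EOF' | sudo tee /etc/nginx/conf.d/custom-logging.conf
log_format timed '$remote_addr - $remote_user [$time_local] "$request" '
                 '$status $body_bytes_sent "$http_referer" "$http_user_agent" $request_time';
access_log /var/log/nginx/access.log timed;
EOF
$ sudo nginx -t && sudo service nginx reload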

Continue reading “Nginx: Custom access log format and error levels”

AppDynamics: Enabling verbose debug logs for Agents

Enabling verbose logs for the AppDynamics machine or database agents can be invaluable for troubleshooting connectivity or network issues.

Luckily, this is easily done by editing the conf/logging/log4j.xml file.  By default, only error level messages are sent to the logs:

<root>
  <priority value="error"/>
  <appender-ref ref="FileAppender"/>
</root>

But you can modify this so that debug level is sent:

<root>
  <priority value="debug"/>
  <appender-ref ref="FileAppender"/>
</root>

Continue reading “AppDynamics: Enabling verbose debug logs for Agents”

OpenWrt: Upgrading OpenWrt to the latest snapshot build

Although stable releases of OpenWrt come out every 6 to 12 months, the automatically built snapshots offer a way to embrace the latest features, patches, and security fixes without waiting that long.

A sysupgrade procedure works by saving the configuration files from known locations, deleting the entire filesystem, installing the new version of OpenWrt, and then restoring the configuration files.

This is usually painless, but there can be issues if configuration changes have been made in non-standard file locations and are not saved.  Additionally, custom packages do not survive the sysupgrade and have to be reinstalled (to ensure compatibility with the kernel) and their new configurations must be manually merged.
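
A rough sketch of the procedure, run from the router itself (the target path and image name below are placeholders; pick the correct sysupgrade image for your hardware from downloads.openwrt.org/snapshots):

$ cd /tmp
$ wget https://downloads.openwrt.org/snapshots/targets/ath79/generic/openwrt-ath79-generic-EXAMPLE-squashfs-sysupgrade.bin
$ sysupgrade -v /tmp/openwrt-ath79-generic-EXAMPLE-squashfs-sysupgrade.bin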

Continue reading “OpenWrt: Upgrading OpenWrt to the latest snapshot build”

PingIdentity: Disabling SSLv3 and weak ciphers for PingFederate

The PingFederate server provides best-in-class Identity Management and SSO.  However, due to US laws governing export of cryptography, the default SSL protocols and cipher suites need to be configured to harden the solution.

Below are the steps involved with making these post-installation changes.

Continue reading “PingIdentity: Disabling SSLv3 and weak ciphers for PingFederate”

AppDynamics: Java Spring PetClinic and PostgreSQL configured for monitoring

As an exploration of AppDynamics’ APM functionality, you may find it useful to deploy a sample application that can quickly return useful data.  The Java Spring PetClinic connecting back to a PostgreSQL database provides a simple code base that exercises both database and application monitoring.

In a previous article, I went over the detailed steps for monitoring PetClinic with a MySQL backend, so I will refer back to that article for some of the details and will focus on the PostgreSQL specific steps here.

Continue reading “AppDynamics: Java Spring PetClinic and PostgreSQL configured for monitoring”

OpenSSL: Using OpenSSL to enumerate protocols and ciphers in use by web applications

While enabling HTTPS is an important step in securing your web application, it is critical that you also take steps to disable legacy protocols and low strength ciphers that can circumvent the very security you are attempting to implement.

As long as you have the latest version of openssl, you should be able to use a bash script like the one below (credit for this script goes here) to enumerate every matching protocol and cipher that a server is exposing:
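
Here is a condensed sketch of that idea (not the exact script credited above): it asks the local openssl client for every cipher it knows about and tries each one against the target host:port.

#!/bin/bash
# usage: ./check-ciphers.sh mysite.example.com:443  (hypothetical script name and host)
SERVER=$1
for cipher in $(openssl ciphers 'ALL:eNULL' | tr ':' ' '); do
  if echo -n | openssl s_client -connect "$SERVER" -cipher "$cipher" 2>/dev/null | grep -q "Cipher is ${cipher}"; then
    echo "SUPPORTED: ${cipher}"
  fi
done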

Continue reading “OpenSSL: Using OpenSSL to enumerate protocols and ciphers in use by web applications”

Selenium: Running headless automated tests on Ubuntu

Selenium is an open-source solution for automating the browser, allowing you to run continuous integration tests, validate performance and scalability, and perform regression testing of web applications.

This kind of automated testing is useful not only from desktop systems, but also from server machines where you may want to monitor availability or correctness of returned pages.  For example, web site response monitoring or as part of a Jenkins validation pipeline.

The first method we can use to accomplish this is to use a headless driver such as the HtmlUnit or PhantomJS driver – these are tiny browser implementations that load and execute web pages but do not actually draw the results to a screen.

The second method is specific to Linux based systems, where you use the actual Chrome browser.  The trick is to use Xvfb as a virtualized display.
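
For the Xvfb approach, a minimal sketch looks like this (the test script name is hypothetical):

$ sudo apt-get install xvfb -y
$ xvfb-run --server-args="-screen 0 1280x1024x24" python run_selenium_tests.py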

Continue reading “Selenium: Running headless automated tests on Ubuntu”

Ubuntu: Silent package installation and debconf

If you have worked on deploying packages via apt-get, you are probably familiar with a couple of forms of interruption during the package installation and upgrade process.

The first is the text menu shown during package upgrades that informs you that a new configuration file is available and asks if you want to keep your current one, use the new one from the package maintainer, or show the difference.

The second is the occasional ASCII dialog that interrupts the install/upgrade and asks for essential information before moving forward.  The screenshot below is the dialog you get when installing MySQL or MariaDB, asking to set the initial root password for the database.

The problem, in this age of cloud scale, is that you often need completely silent installations and upgrades that can be pushed out via Configuration Management.  Even if this is a build for an immutable image, you would prefer a completely automated construction process instead of manual intervention each time you build an image.
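
As a sketch using the MySQL example above, the debconf answers can be preseeded so the dialog never appears (the password value is obviously just an example):

$ echo "mysql-server mysql-server/root_password password mysecret" | sudo debconf-set-selections
$ echo "mysql-server mysql-server/root_password_again password mysecret" | sudo debconf-set-selections
$ sudo DEBIAN_FRONTEND=noninteractive apt-get install -y -o Dpkg::Options::="--force-confold" mysql-server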

Continue reading “Ubuntu: Silent package installation and debconf”

AppDynamics: Java Spring PetClinic and MySQL configured for monitoring

As an exploration of AppDynamics’ APM functionality, you may find it useful to deploy a sample application that can quickly return useful data.  The Java Spring PetClinic connecting back to a MySQL database provides a simple code base that exercises both database and application monitoring.

We’ll deploy the Java Spring PetClinic onto Tomcat running on Ubuntu 14.04.  MySQL will be the backing persistence engine for the web application.  The AppDynamics Java agent will be loaded into the JVM running Tomcat, and the AppDynamics Database Agent will connect to MySQL for metrics gathering.

Continue reading “AppDynamics: Java Spring PetClinic and MySQL configured for monitoring”

AppDynamics: Installing a Machine Agent on Ubuntu 14.04

The AppDynamics Machine Agent is used not only to report back on basic hardware metrics (cpu/memory/disk/network), but also as the hook for custom plugins that can report back on any number of applications including: .NET, Apache, AWS, MongoDB, Cassandra, and many others.

In this article, I’ll go over the details to install the Machine Agent onto an Ubuntu 14.04 system.

Continue reading “AppDynamics: Installing a Machine Agent on Ubuntu 14.04”

Grafana: Connecting to an ElasticSearch datasource

The ElasticSearch stack (ELK) is a popular open-source solution that serves as both repository and search interface for a wide range of applications including: log aggregation and analysis, analytics store, search engine, and document processing.

Its standard web front-end, Kibana, is a great product for data exploration and dashboards.  However, if you have multiple data sources (including ElasticSearch), want built-in LDAP authentication, or want the ability to annotate graphs, you may want to consider Grafana to surface your dashboards and visualizations.

Continue reading “Grafana: Connecting to an ElasticSearch datasource”