This is going to take some time.

Intended audience

Skyline is not really a localhost application: it needs lots of data, so unless you have Graphite running on localhost, or Graphite pickling data to your localhost, there is little for it to analyse. That said, Skyline can run on localhost using Docker (experimental and not for production).

Given the specific nature of Skyline, it is assumed that the audience will have a certain level of technical knowledge, e.g. it is assumed that the user will be familiar with the installation, configuration, operation and security practices and considerations relating to the following components:

  • Graphite
  • Redis
  • MySQL
  • Apache (or nginx although there are no examples here)
  • memcached

However, these types of assumptions, even if stated, are not useful or helpful to someone not familiar with all the ins and outs of some required thing. This installation document is specifically related to the required installs and configurations of things that are directly related to Skyline. Although it cannot possibly cover all possible setups or scenarios, it does describe recommendations in terms of the configuration of the various components and how and where they should be run in relation to Skyline. There are no cfengine, puppet, chef or ansible patterns here.

The documentation is aimed at installing Skyline securely by default. It is possible to run Skyline very insecurely; however, this documentation does not specify how to do that. Both the setup and the documentation are verbose. Setting up Skyline takes a while.

Skyline’s default settings and documentation are aimed at running behind an SSL-terminated and authenticated reverse proxy and using Redis authentication. This makes the installation process more tedious, but it means that all these inconvenient factors are not left as an afterthought, or added to some TODO list or issue when, after trying Skyline, you decide, “Yes! I want to use Skyline, this is cool”.

For notes regarding automation and configuration management see the section at the end of this page.

What the components do

  • Graphite - sends metric data to Skyline Horizon via a pickle. Graphite is a separate application and is not part of Skyline. It could run on the same server as Skyline, but in a production environment it will probably be running on a remote machine or container.
  • Redis - stores settings.FULL_DURATION seconds (usefully 24 hours' worth) of the time series data that Graphite sends to Skyline Horizon; Horizon writes the data to Redis and Skyline Analyzer pulls the data from Redis for analysis. Redis must run on the same host as Skyline, although it may be possible to run Redis in another container or VM that is on the same host.
  • MySQL/mariadb - stores data about anomalies and time series features profile fingerprints for matching and learning things that are not anomalous. MySQL can run on the same host as Skyline or it can be remote. Running the DB remotely will make the Skyline UI a bit slower.
  • Apache (or nginx) - Skyline serves the webapp via gunicorn; Apache handles endpoint routing, SSL termination and basic HTTP auth. Ideally Apache should run on the same host as Skyline.
  • memcached - caches Ionosphere MySQL data, memcached should ideally be run on the same host as Skyline.


Use sudo appropriately for your environment wherever necessary.



All the documentation and testing is based on running Skyline in a Python-2.7.16 virtualenv; if you choose to deploy Skyline another way, you are on your own. Although it is possible to run Skyline in a different type of environment, doing so does not lend itself to repeatability or a common known state. Python-2.7.14 should still work as well, but all documentation has been updated to use Python-2.7.16.

Skyline configuration

All Skyline configuration is handled in skyline/ and in this documentation configuration options are referred to by their docstring name, e.g. settings.FULL_DURATION, which links to their description in the documentation.


  • Should you wish to review the build steps, component builds and installs described below, there is a convenience build script, for testing purposes only, in utils/dawn/ - see the Dawn section

Python virtualenv


  • Please set up all the firewall rules to restrict access to the following before you continue to install the other components:
    • The IP address and port being used to reverse proxy the Webapp e.g. <YOUR_SERVER_IP_ADDRESS>:443, ensure that this is only accessible to specified IPs in iptables/ip6tables (further these addresses should also be added to the reverse proxy conf as Allow from defines when you create the reverse proxy conf file).
    • The IP address and port being used by MySQL/mariadb, if you are not binding MySQL/mariadb to only, ensure that the MySQL/mariadb port declared in settings.PANORAMA_DBPORT (default 3306) is only accessible to specified IPs in iptables/ip6tables
    • Allow the IP address of your Graphite server/s on ports 2024 and 2025 (the default Graphite pickle ports)
    • The IP address and port being used by Redis, which if you are not running multiple distributed Skyline instances should be, even if you are running multiple distributed Skyline instances this can and still should be as Skyline makes an API endpoint available to remote Skyline instances for any required remote Redis data retrieval and preprocessing.
    • If you are going to run Vista and Flux, ensure that the Skyline IP is allowed to connect to the Graphite node on the PICKLE_RECEIVER_PORT
    • Please ensure you handle all of these with iptables AND ip6tables (or the equivalent) before continuing.
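
As a rough sketch only, the rules above might look something like the following with iptables. All the addresses here are placeholders (<YOUR_ADMIN_IP>, <YOUR_SKYLINE_IP> and <YOUR_GRAPHITE_IP> are assumptions for illustration, not recommendations), and equivalent ip6tables rules are also required:

```shell
# Hypothetical iptables sketch of the restrictions described above.
# Substitute your own addresses and mirror these rules in ip6tables.

# Reverse proxied Webapp - only allow your specified admin IPs
iptables -A INPUT -p tcp -s <YOUR_ADMIN_IP> -d <YOUR_SKYLINE_IP> --dport 443 -j ACCEPT
iptables -A INPUT -p tcp -d <YOUR_SKYLINE_IP> --dport 443 -j DROP

# Graphite pickle ports - only allow your Graphite server/s
iptables -A INPUT -p tcp -s <YOUR_GRAPHITE_IP> --dport 2024 -j ACCEPT
iptables -A INPUT -p tcp -s <YOUR_GRAPHITE_IP> --dport 2025 -j ACCEPT
iptables -A INPUT -p tcp --dport 2024 -j DROP
iptables -A INPUT -p tcp --dport 2025 -j DROP

# MySQL/mariadb - if not bound locally, only allow specified IPs on
# the settings.PANORAMA_DBPORT port (default 3306)
iptables -A INPUT -p tcp -s <YOUR_SKYLINE_IP> --dport 3306 -j ACCEPT
iptables -A INPUT -p tcp --dport 3306 -j DROP
```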


unixsocket /tmp/redis.sock
unixsocketperm 777


The unixsocket on the apt redis-server package is /var/run/redis/redis.sock; if you use this path, ensure you change settings.REDIS_SOCKET_PATH to this path

  • Ensure Redis has a long requirepass set in redis.conf
  • Ensure Redis bind is set in redis.conf; consider specifically stating bind even if you are going to run multiple distributed Skyline instances, as Skyline gets remote Redis data preprocessed via a Skyline API, so there is no need to bind Redis to any other IP.
  • Start Redis
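
Once Redis is started, a quick sanity check might look like the following (a sketch assuming the /tmp/redis.sock path from the example config above; <YOUR_REDIS_PASSWORD> is a placeholder for your requirepass value):

```shell
# Verify Redis answers on the unix socket - the socket path and
# password here are placeholders matching the examples above
redis-cli -s /tmp/redis.sock -a "<YOUR_REDIS_PASSWORD>" ping
# A healthy instance replies with PONG
```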


  • Install memcached and start memcached.
  • Ensure that memcached is started bound only to the appropriate local address by passing the daemon the -l option; Skyline only requires memcached locally.

Skyline directories

  • Make the required directories
mkdir /var/log/skyline
mkdir /var/run/skyline
mkdir /var/dump

mkdir -p /opt/skyline/panorama/check
mkdir -p /opt/skyline/mirage/check
mkdir -p /opt/skyline/crucible/check
mkdir -p /opt/skyline/crucible/data
mkdir -p /opt/skyline/ionosphere/check
mkdir /etc/skyline
mkdir /tmp/skyline


Ensure you provide the appropriate ownership and permissions to the above specified directories for the user you wish to run the Skyline process as.
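
For example, assuming a hypothetical skyline user and group (substitute whatever user you intend to run the Skyline processes as; Skyline does not create one for you), ownership could be set along these lines:

```shell
# Sketch only - SKYLINE_USER is a placeholder for the user you run
# Skyline as; this requires appropriate privileges (sudo)
SKYLINE_USER="skyline"
for skyline_dir in /var/log/skyline /var/run/skyline /var/dump /opt/skyline /tmp/skyline
do
  chown -R "$SKYLINE_USER":"$SKYLINE_USER" "$skyline_dir"
done
```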

Skyline and dependencies install

  • git clone Skyline (git should have been installed in the Running in Python virtualenv section); it is recommended to then git checkout the commit reference of the latest stable release.
mkdir -p /opt/skyline/github
cd /opt/skyline/github
git clone
# If you wish to switch to a specific commit or the latest release
#cd /opt/skyline/github/skyline
#git checkout <COMMITREF>
  • Once again using the Python-2.7.16 virtualenv, install the requirements using the virtualenv pip; this can take some time.


When working with virtualenv Python versions you must always remember to use the activate and deactivate commands to ensure you are using the correct version of Python. Although running a virtualenv does not affect the system Python, not using activate can result in errors that MAY affect the system Python and its packages. For example, if a user does not use activate and runs pip rather than bin/pip2.7, pip may install packages into the system Python. Get into the habit of always using explicit bin/pip2.7 and bin/python2.7 commands to make it harder for you to err.


If you are running on CentOS 6, mysql-connector-python needs to be fixed to 8.0.6, because if you use the MySQL 5.1 rpm from mainstream, mysql-connector-python 8.0.11 and later (which dropped support for MySQL 5.1) results in a bad handshake error. Further to this, there is a reported high severity vulnerability in mysql-connector-python-8.0.6 (Improper Access Control). You have been advised, so now you know.


source bin/activate

# As of statsmodels 0.9.0 numpy, et al need to be installed before
# statsmodels in requirements
bin/"pip${PYTHON_MAJOR_VERSION}" install $(cat /opt/skyline/github/skyline/requirements.txt | grep "^numpy\|^scipy\|^patsy" | tr '\n' ' ')
bin/"pip${PYTHON_MAJOR_VERSION}" install $(cat /opt/skyline/github/skyline/requirements.txt | grep "^pandas")

# CentOS 6 ONLY
# mysql-connector-python needs to be fixed to 8.0.6 on CentOS 6 as it uses
# MySQL 5.1 rpm from mainstream, as of mysql-connector-python 8.0.11 support
# for 5.1 was dropped and results in a bad handshake error.
if [ -f /etc/redhat-release ]; then
  CENTOS=$(cat /etc/redhat-release | grep -c "CentOS")
  if [ $CENTOS -eq 1 ]; then
    CENTOS_6=$(cat /etc/redhat-release | grep -c "release 6")
    if [ $CENTOS_6 -eq 1 ]; then
      echo "Replacing mysql-connector-python version in requirements.txt as CentOS 6 requires mysql-connector-python==8.0.6"
      cat /opt/skyline/github/skyline/requirements.txt > /opt/skyline/github/skyline/requirements.txt.original
      cat /opt/skyline/github/skyline/requirements.txt.original | sed -e 's/^mysql-connector-python==.*/mysql-connector-python==8\.0\.6/g' > /opt/skyline/github/skyline/requirements.txt.centos6
      cat /opt/skyline/github/skyline/requirements.txt.centos6 > /opt/skyline/github/skyline/requirements.txt
    fi
  fi
fi

# This can take lots of minutes...
bin/"pip${PYTHON_MAJOR_VERSION}" install -r /opt/skyline/github/skyline/requirements.txt
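
To illustrate what the numpy/scipy/patsy preselection above feeds to pip, here is the same grep pattern run against a throwaway requirements file (the version pins here are made up for the example):

```shell
# Demonstrate the pre-install selection used above on an example file
# (versions are illustrative only, not Skyline's actual pins)
cat > /tmp/requirements.example.txt << 'EOF'
numpy==1.16.4
scipy==1.2.2
patsy==0.5.1
pandas==0.24.2
statsmodels==0.10.0
EOF
PRE_INSTALL=$(cat /tmp/requirements.example.txt | grep "^numpy\|^scipy\|^patsy" | tr '\n' ' ')
echo "$PRE_INSTALL"
# prints: numpy==1.16.4 scipy==1.2.2 patsy==0.5.1
```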

  • Copy the skyline.conf and edit USE_PYTHON as appropriate to your setup if it is not using the PATH /opt/python_virtualenv/projects/skyline-py2716/bin/python2.7
cp /opt/skyline/github/skyline/etc/skyline.conf /etc/skyline/skyline.conf
vi /etc/skyline/skyline.conf  # Set USE_PYTHON as appropriate to your setup

Apache reverse proxy

  • OPTIONAL but recommended, serving the Webapp via gunicorn with an Apache reverse proxy.
    • Set up Apache (httpd) and see the example configuration file in your cloned directory /opt/skyline/github/skyline/etc/skyline.httpd.conf.d.example - modify all the <YOUR_ variables as appropriate for your environment - see Apache and gunicorn
    • Create a SSL certificate and update the SSL configurations in the Skyline Apache config (or your reverse proxy)
  • Update your Apache (or reverse proxy config) with the X-Forwarded-Proto header.
RequestHeader set X-Forwarded-Proto "https"
  • Add a user and password for HTTP authentication; the user does not have to be admin, it can be anything, e.g.
htpasswd -c /etc/httpd/conf.d/.skyline_htpasswd admin


Ensure that the user and password for Apache match the user and password that you provide for settings.WEBAPP_AUTH_USER and settings.WEBAPP_AUTH_USER_PASSWORD

  • Deploy your Skyline Apache configuration file and restart httpd.
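
Once httpd is restarted, a simple way to verify the reverse proxy and basic auth is with curl (a sketch; the URL and credentials are placeholders for your own values):

```shell
# Hypothetical check - substitute your own host and credentials.
# Without credentials the reverse proxy should refuse access:
curl -sk -o /dev/null -w "%{http_code}\n" https://<YOUR_SKYLINE_IP>/
# expect 401

# With the htpasswd user created above:
curl -sk -o /dev/null -w "%{http_code}\n" -u admin:<YOUR_PASSWORD> https://<YOUR_SKYLINE_IP>/
# expect 200
```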

Skyline database

  • Create the Skyline MySQL/mariadb database for Panorama (see Panorama) and Ionosphere.

Skyline settings

The Skyline settings are declared in the file as valid Python variables which are used in the code. The settings values therefore need to be defined correctly as the required Python types. Strings, floats, ints, lists and tuples are used in the various settings; examples of these Python types are briefly outlined below.

a_string = 'single quoted string'  # str
another_string = ''  # str
a_float = 0.1  # float
an_int = 12345  # int
a_list = [1.1, 1.4, 1.7]  # list
another_list_of_strings = ['one', 'two', 'bob']  # list
a_list_of_lists = [['server1.cpu.user', 23.6, 1563912300], ['server2.cpu.user', 3.22, 1563912300]]  # list
a_tuple = ('server1.cpu.user', 23.6, 1563912300)  # tuple
a_tuple_of_tuples = (('server1.cpu.user', 23.6, 1563912300), ('server2.cpu.user', 3.22, 1563912300))  # tuple

Required changes to follow.

cd /opt/skyline/github/skyline/skyline


A special settings variable that needs mentioning is the alerter setting settings.SYSLOG_ENABLED. This variable is True by default and in this mode Skyline sends all anomalies to syslog and Panorama records ALL anomalies in the database, not just anomalies for metrics that have a settings.ALERTS tuple defined. This is the desired default state. This setting basically enables 3-sigma anomaly detection on everything and builds the anomalies database; it is not noisy. At this point in your implementation the distinction between alerts, and general Skyline anomaly detection and the construction of an anomalies data set, must once again be pointed out.

  • Implementing and working with Ionosphere and setting up learning (see Ionosphere) comes later, after you have the other Skyline apps up and running.
  • If you are upgrading, at this point return to the Upgrading page.

Starting and testing the Skyline installation

  • Before you test Skyline by seeding Redis with some test data, ensure that you have configured the firewall/iptables/ip6tables with the appropriate restricted access.
  • Start the Skyline apps
/opt/skyline/github/skyline/bin/horizon.d start
/opt/skyline/github/skyline/bin/analyzer.d start
/opt/skyline/github/skyline/bin/webapp.d start
# And Panorama if you have set up in the DB at this stage
/opt/skyline/github/skyline/bin/panorama.d start
/opt/skyline/github/skyline/bin/ionosphere.d start
/opt/skyline/github/skyline/bin/luminosity.d start
  • Check the log files to ensure things started OK and are running and there are no errors.


When checking a log make sure you check the log for the appropriate time, Skyline can log fast, so short tails may miss some event you expect between the restart and tail.

# Check what the logs reported when the apps started
head -n 20 /var/log/skyline/*.log

# How are they running
tail -n 20 /var/log/skyline/*.log

# Any errors - each app
find /var/log/skyline -type f -name "*.log" | while read skyline_logfile
do
  echo "#####
# Checking for errors in $skyline_logfile"
  cat "$skyline_logfile" | grep -B2 -A10 -i "error ::\|traceback" | tail -n 60
  echo ""
  echo ""
done
  • Seed Redis with some test data.


If you are UPGRADING and you are using an already populated Redis store, you can skip seeding data.


If you already have Graphite pickling data to Horizon, seeding data will not work, as Horizon/listen will already have a connection and will be reading the Graphite pickle.

source bin/activate
"bin/python${PYTHON_MAJOR_VERSION}" /opt/skyline/github/skyline/utils/
  • Check the Skyline Webapp frontend on the Skyline machine’s IP address and the appropriate port, depending on whether you are serving it proxied or direct, e.g. https://YOUR_SKYLINE_IP. The horizon.test.pickle metric anomaly should be in the dashboard once the seeding is complete. If Panorama is set up you will be able to see it in the /panorama view and in the rebrow view as well.
  • This will ensure that the Horizon service is properly set up and can receive data. For real data, you have some options relating to getting a data pickle from Graphite - see Getting data into Skyline
  • Check the log files again to ensure things are running and there are no errors.
  • Once you have your settings.ALERTS configured, to test them see Alert testing

Configure Graphite to send data to Skyline

Other Skyline components

Automation and configuration management notes

The installation of the packages in the requirements.txt can take a long time, specifically the pandas build. This will usually take longer than the default timeouts in most configuration management tools.

That said, requirements.txt can be run in an idempotent manner, however a few things need to be highlighted:

  1. A first time execution of bin/"pip${PYTHON_MAJOR_VERSION}" install -r /opt/skyline/github/skyline/requirements.txt will timeout on configuration management. Therefore consider running this manually first. Once pip has installed all the packages, the requirements.txt will run idempotent with no issue and be used to upgrade via a configuration management run when the requirements.txt is updated with any new versions of packages (with the possible exception of pandas). It is obviously possible to provision each requirement individually directly in configuration management and not use pip to install -r the requirements.txt, however remember the the virtualenv pip needs to be used and pandas needs a LONG timeout value, which not all package classes provide, if you use an exec of any sort, ensure the pandas install has a long timeout.