These instructions are written for installation on a Linux system. Where also works on Windows, but it is less tested on that platform and no separate instructions are provided.

Download Where

In order to install Where, you first need to download it. Please download the latest version of Where and unpack it at a convenient location on your computer.

All commands shown in this guide should be performed in a terminal from inside the where-directory you just downloaded and unpacked.
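
As a concrete sketch of these first steps, assuming the download is a source archive named where-master.zip that unpacks to a where-master directory (both names are assumptions; adjust them to match your actual download):

unzip where-master.zip
cd where-master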

Prerequisites

The Where software is mainly written in Python. As such, it depends on having a modern Python (>= 3.6) installed on your system, including several packages from the scientific stack. Where also utilizes Cython (essentially a Python-to-C translator) for better performance. Finally, Where depends on some Fortran libraries with community implementations of conventional models.

Step 1: Install Anaconda Python

We recommend using Anaconda Python. This is a Python distribution that comes with its own package manager, conda, which makes it easier to install other Python packages, especially on Windows.

An alternative to the full Anaconda distribution is Miniconda. It comes with the same conda package manager, but only installs a minimal Python system by default. As such, Miniconda is a great choice, for instance, on servers.

Install Anaconda or Miniconda following the instructions on their respective web pages.
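
For example, a minimal sketch for installing Miniconda on Linux from the terminal (the installer name below is the generic "latest" 64-bit Linux installer; check the Miniconda download page for the current file name before running it):

wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash Miniconda3-latest-Linux-x86_64.sh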

Step 2: Install Necessary Python Dependencies

Once Anaconda or Miniconda is installed, you should install the Python packages needed by Where by doing:

conda env update -f environment.yml
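
If environment.yml declares a named environment (whether it does, and which name it uses, is not specified here), the packages are installed into that environment, and you need to activate it before running the remaining commands in this guide. A quick way to check, with NAME as a placeholder for whatever the file declares:

grep "^name:" environment.yml
conda activate NAME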

Step 3: Install a Fortran Compiler

For Linux, gfortran is needed.

Step 4: Install a C Compiler

For Linux, gcc is needed.

Step 5: Other dependencies

For Linux, the following packages are needed (a combined install command for steps 3-5 is sketched after this list):

  • patch
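
On Debian- or Ubuntu-based systems (an assumption; package names and the package manager differ between distributions), the tools from steps 3-5 can typically be installed in one go:

sudo apt-get install gcc gfortran patch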

Installing Where

The actual installation of Where happens in three steps:

  • Compile the external Fortran libraries and build them as a Python module using f2py.
  • Compile and build the Cython code.
  • Install Where on your computer.

All three steps are carried out automatically by running the make command:

make

You can confirm that the installation succeeded by asking for help about Where:

where -h

If you get an error saying that Where is an unknown command, you most likely need to add Where to your path.
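
A minimal sketch for doing this in bash, assuming the where executable was installed to ~/.local/bin (the actual install location depends on your Python setup, so adjust the directory accordingly):

export PATH="$HOME/.local/bin:$PATH"

Add the same line to your ~/.bashrc (or equivalent shell startup file) to make the change permanent.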

Note: On Windows the command to start Where is gd_where. In the example above you should run gd_where -h.

Set Up Paths Used by Where

Where uses a few different directories. The locations of these are controlled by the Where configuration. You should create a new text file called where_local.conf inside the where/config directory.

Add the following contents and save the file:

[path]
data     = ~/where_data
work     = ~/where_work
log      = ~/where_log
publish  = ~/where_publish

This specifies the location of four different directories:

  • data: The data directory will contain all data needed to analyze a session, including observation files, reference frames, and a priori model information.
  • work: The work directory will contain all active analysis files, including configuration, analysis datasets and output products.
  • log: The log directory will contain log files with information from running many Where analyses. Log files for single sessions are stored in their respective directories inside the work directory.
  • publish: It is possible to publish some of the result files. Publishing a file simply means that it will be copied to some location within the publish directory.

Using the defaults above, these are created as four separate directories named where_data, where_work, where_log, and where_publish inside your home directory. Feel free to change these paths according to your taste.
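
To create the four default directories up front (purely a convenience; adjust the paths if you changed the configuration above):

mkdir -p ~/where_data ~/where_work ~/where_log ~/where_publish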

Obtaining data files

Where has functionality for automatically downloading the data files that it needs when it is running.

You should, however, download the observation files you want to analyze yourself. In the examples below, we assume that you specified the data directory to be ~/where_data/. Change the path if you specified a different data directory above.

Analyzing vgosDB Format

The following downloads the observation file for January 2nd 2018:

mkdir -p ~/where_data/vlbi/obs/vgosdb/2018 
cd ~/where_data/vlbi/obs/vgosdb/2018 
wget https://cddis.gsfc.nasa.gov/pub/vlbi/ivsdata/vgosdb/2018/18JAN02XA.tgz
tar -xvf 18JAN02XA.tgz

You can then analyze this file by running:

where 2018 1 2 --vlbi --session_code=R1823

Analyzing NGS Format

Note: New NGS files are no longer produced, and the recommended file format for observation files is vgosDB.

The following downloads the observation file for January 2nd 2018:

mkdir -p ~/where_data/vlbi/obs/ngs/2018 
cd ~/where_data/vlbi/obs/ngs/2018 
wget https://cddis.gsfc.nasa.gov/pub/vlbi/ivsdata/ngs/2018/18JAN02XA_N004.gz

You can then analyze this file by running:

where 2018 1 2 --vlbi --session_code=R1823 --obs_format=ngs

Other files

Where will automatically try to download a version of all other files needed to do an analysis. This is to facilitate an easy setup for beginners. However, some sites require a personal account to be able to download the necessary files. One example is files from CDDIS. Where will give an error message if any files are missing when running the program. Investigate these error messages, create personal accounts when needed, and download the missing files. Some files also need to be updated routinely, for instance files with Earth Orientation Parameters. These will not be updated automatically. See config/files.conf and config/files_pipeline_vlbi.conf for information about the files used by Where.
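
As an example of how such an error message can be investigated, you can search the file configuration for the entry that defines the missing file; the search term "eop" below is only an illustration:

grep -n -A 5 "eop" config/files.conf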

External libraries

Where depends on a few external libraries that are not included in the download. These are all available online and have already been downloaded automatically by the make-process.

SOFA

The SOFA (Standards of Fundamental Astronomy) software library is used for fundamental astronomy calculations, such as Celestial to Terrestrial transformations.

The latest version of the library should be downloaded from the SOFA web page and stored in the external/sofa/ directory. This can be done automatically by running

python download.py sofa

The library depends on a catalog of leap seconds and should therefore be updated regularly.

IERS software

In conjunction with the IERS Conventions (2010), the IERS has made available implementations of several conventional models. Where uses these models directly whenever possible.

The latest version of the library should be downloaded from the IERS web page and stored in the external/iers/ directory. This can be done automatically by running

python download.py iers_2010

The library depends on a catalog of leap seconds and should be updated regularly. Before compilation of the iers_2010 library, some of the source code and make files need to be patched. Therefore, make sure patch (Linux) is installed.

GPT2w

One of the available troposphere models in Where is GPT2w. This is implemented using the original GPT2w model by Böhm, J., Möller, G., Schindelegger, M. et al., GPS Solut (2015) 19: 433.

The latest version of the Fortran library should be downloaded from the GPT2w web page and stored in the external/gpt2w/ directory. This can be done automatically by running

python download.py gpt2w

High Frequency EOP software

An IERS working group on Diurnal and Semi-diurnal EOP Variations has made some software and models available at https://ivscc.gsfc.nasa.gov/hfeop_wg/.

The latest version of the Fortran library should be downloaded from the working group web page and stored in the external/iers/hf_eop/ directory. This can be done automatically by running

python download.py hf_eop