LOFAR observation data are available from the Long Term Archive (LTA). The LTA can be difficult to navigate, so we recommend reading the Long Term Archive Howto on the LOFAR wiki.

Note: Not all data on the LTA are public, and you may have to request access from the Solar and Space Weather KSP.

Request the data, then download it with wget.
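The LTA staging service typically provides a plain-text list of download links. A minimal sketch for fetching every URL in such a list (the list file name html.txt and the output directory are assumptions):

```python
import subprocess
from pathlib import Path

def build_wget_cmd(url: str, outdir: str = ".") -> list:
    """wget command that resumes partial downloads (-c) into outdir (-P)."""
    return ["wget", "-c", "-P", outdir, url]

def download_all(url_list_file: str, outdir: str = "raw_data") -> None:
    """Download every non-empty line of url_list_file with wget."""
    Path(outdir).mkdir(exist_ok=True)
    with open(url_list_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        subprocess.run(build_wget_cmd(url, outdir), check=True)

# usage (requires wget in PATH):
# download_all("html.txt")
```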


Raw LOFAR data must undergo a series of preprocessing steps before they can be used for imaging, namely auto-weighting and calibration.


Raw LOFAR data have no ‘WEIGHT’ column in the measurement set (MS), so it is necessary to run the autoweight step:

DPPP msin=path/to/rawdata.MS msout=path/to/output.MS steps=[] msin.autoweight=true

This calculates the weights for the visibilities from the auto-correlation data of the observation.

The script can batch this process when there is more than one file.
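For example, the batch can be expressed as a loop over every raw MS in a directory; a sketch mirroring the DPPP command above (the directory layout and output naming are assumptions):

```python
import subprocess
from pathlib import Path

def autoweight_cmd(ms_in, ms_out) -> list:
    """DPPP autoweight command for a single measurement set."""
    return ["DPPP", f"msin={ms_in}", f"msout={ms_out}",
            "steps=[]", "msin.autoweight=true"]

def batch_autoweight(raw_dir: str, out_dir: str) -> None:
    """Run the autoweight step on every *.MS found in raw_dir."""
    Path(out_dir).mkdir(exist_ok=True)
    for ms in sorted(Path(raw_dir).glob("*.MS")):
        out = Path(out_dir) / ms.name.replace(".MS", "-autow.MS")
        subprocess.run(autoweight_cmd(ms, out), check=True)

# usage (requires DPPP in PATH):
# batch_autoweight("raw/", "autow/")
```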

(This step can be skipped if the measurement set has already been pre-processed by the imaging pipeline.)


Calibration of LOFAR data is performed using the Default Preprocessing Pipeline (DPPP).

To use DPPP on the LOFAR computing facilities, first run:

module load lofar
module load dp3

The measurement set is calibrated with DPPP in three steps:

  • Predict Calibration: use the calibrator observation to predict the gain calibration solution.

  • Apply Calibration: apply the gain calibration solution to the solar observation.

  • Apply Beam: apply the LOFAR beam to the dataset.

These steps are covered in greater detail in the LOFAR Imaging Cookbook. Below we give a short example of how to calibrate a raw LOFAR observation.

Predict Calibration

  1. Create a sourcedb file with a sky model of the calibrator. Here we use TauA as an example, but other sky models are available from

makesourcedb in=/path/to/TauA.skymodel out=TauA.sourcedb
  2. Run auto-weight with DPPP (note: this step should be done for both the Sun and the calibrator):

DPPP msin=/path/to/calibrator.MS msout=/path/to/calibrator-autow.MS steps=[] msin.autoweight=true

  3. Use the observation of the calibrator to predict the gain calibration solutions to be applied to the solar observation.

DPPP msin=/path/to/calibrator-autow.MS \
msout=. \
steps=[gaincal] \
gaincal.usebeammodel=True \
gaincal.solint=4 \
gaincal.sources=TauA \
gaincal.sourcedb=TauA.sourcedb \
gaincal.onebeamperpatch=True

Apply Calibration

  4. Apply the solutions predicted in step (3):

DPPP msin=/path/to/sun-autow.MS \
msout=. \
msin.datacolumn=DATA \
msout.datacolumn=CORR_NO_BEAM \
steps=[applycal] \
applycal.parmdb=/path/to/calibrator-autow.MS/instrument

Apply Beam

  5. Apply the LOFAR station beam model to the dataset:

DPPP msin=sun-autow.MS \
msout=. \
msin.datacolumn=CORR_NO_BEAM \
msout.datacolumn=CORRECTED_DATA \
steps=[applybeam]

Steps (2)-(5) are integrated into the script, which calibrates the MS files in batch.

The script automates the calibration of LOFAR observations: it generates the parset file for the calibration and runs the corresponding DPPP command.

Modify the configuration lines in the code:

sources  = 'TauAGG'  # name of the calibrator source
sourcedb = 'taurus_1.sourcedb' # path to the sourcedb file

sun_MS_dir   = 'MS/' # path to the directory containing the Sun's MS
calib_MS_dir = 'MS/' # path to the directory containing the calibrator's MS

obs_id_sun   = 'L722384' # obsid of the sun
obs_id_calib = 'L701915' # obsid of the calibrator

idx_range_sun  = [32,39] # index range of the subband of the Sun
idx_range_cali = [92,99] # index range of the subband of the calibrator

run_step = [0,1,2]  # 0 for predict; 1 for applycal; 2 for applybeam
# [0,1,2] for complete calibration
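To illustrate what the script does internally, the predict step's parset could be rendered from these configuration values roughly as follows (a sketch; the actual script's key names and file layout may differ):

```python
def make_gaincal_parset(ms_in: str, sources: str, sourcedb: str,
                        solint: int = 4) -> str:
    """Render a DPPP parset mirroring the predict-calibration command."""
    lines = [
        f"msin={ms_in}",
        "msout=.",
        "steps=[gaincal]",
        "gaincal.usebeammodel=True",
        f"gaincal.solint={solint}",
        f"gaincal.sources={sources}",
        f"gaincal.sourcedb={sourcedb}",
        "gaincal.onebeamperpatch=True",
    ]
    return "\n".join(lines) + "\n"

# usage: write the parset, then run `DPPP gaincal.parset`
# open("gaincal.parset", "w").write(
#     make_gaincal_parset("calibrator-autow.MS", "TauAGG", "taurus_1.sourcedb"))
```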

Run the calibration script simply with:


Inspecting Calibration Solutions

In many cases, solar radio bursts can contaminate the observation of the calibrator source. It is thus highly recommended [1] that the gain calibration solutions obtained with DPPP are visually inspected. The LOFAR Solution Tool (LoSoTo) can be used to plot the calibration solutions for each antenna using the cal_solution_plot.parset file below

[plot_amp]
operation = PLOT
soltab = sol000/amplitude000
axesInPlot = [time,freq]
axisInTable = ant
axisInCol = pol
markerSize = 4
plotFlag = True
doUnwrap = False

[plot_phase]
operation = PLOT
soltab = sol000/phase000
axesInPlot = [time,freq]
axisInTable = ant
axisInCol = pol
markerSize = 4
plotFlag = True
doUnwrap = True

Convert the calibration solutions stored at /path/to/calibrator-autow.MS/instrument to an H5 file compatible with LoSoTo, then generate the calibration solution plots as follows:

-v cal_solutions.h5  /path/to/calibrator-autow.MS/instrument
losoto -v cal_solutions.h5 cal_solution_plot.parset


Once the data have been calibrated, they can be imaged. We recommend using WSClean for this. An example WSClean call for the Sun:

wsclean -j 40 -mem 30 -no-reorder -no-update-model-required \
-mgain 0.3 -weight briggs 0 -size 512 512 \
-scale 10asec -pol I -data-column CORRECTED_DATA \
-niter 1000 -intervals-out 1 -interval 10 11 \
-name /path/to/prefix \
/path/to/sun-autow.MS

It is better to enable the -multiscale parameter for solar image CLEANing, because solar radio emission is extended.

A small cheatsheet for solar WSClean:

Command         Parameter        Description

-j              40               Number of threads used for CLEAN (can be equal to the number of cores).

-mem            30               Maximum memory limit as a percentage of system memory (don't use 100%).

-weight         briggs 0.2       Weighting for the baselines; Briggs 0 works in most situations.

-size           2048 2048        Size of the image in pixels.

-scale          10asec           The angular scale of one pixel (e.g. 0.1asec, 3asec, 3min, 3deg). This should be less than 1/4 of the beam size in order to properly sample the beam.

-pol            I                The polarization for cleaning; can be I, Q, U, V.

-multiscale                      Whether to use multiscale CLEAN; better to switch on for extended sources.

-data-column    CORRECTED_DATA   Be sure to use the calibrated data (CORRECTED_DATA).

-niter          1000             The number of CLEAN iterations; for the Sun, 400 is necessary, 1000 can be better, 2000 is enough.

-intervals-out  1                How many images to produce.

-interval       3000 4000        The time-index range for the CLEAN.

For the interval index, the observation start time and integration time can be used to find the starting and ending indices.
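The interval indices count integration steps from the start of the observation, so a desired time window can be converted to indices given the observation start time and the integration time; a sketch (the times shown are hypothetical):

```python
from datetime import datetime

def interval_index(obs_start: datetime, t: datetime, dt_sec: float) -> int:
    """Nearest integration-step index for time t, with step length dt_sec."""
    return int(round((t - obs_start).total_seconds() / dt_sec))

# e.g. a window 5-6 minutes after a (hypothetical) 12:00:00 start,
# with 0.1 s integrations, maps to `-interval 3000 3600`
start = datetime(2019, 4, 13, 12, 0, 0)
lo = interval_index(start, datetime(2019, 4, 13, 12, 5, 0), 0.1)
hi = interval_index(start, datetime(2019, 4, 13, 12, 6, 0), 0.1)
```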


WSClean produces FITS images in astronomical coordinates (RA, Dec) with units of Jy/beam. The module lofarSun.IM can transform the coordinates to the heliocentric frame and convert the flux to a brightness temperature distribution according to the equation given in Flux intensity.
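The flux-to-brightness-temperature step follows the Rayleigh-Jeans law, T_b = c^2 S / (2 k nu^2 Omega_beam), with the solid angle of a Gaussian beam Omega = pi*bmaj*bmin / (4 ln 2). A standalone sketch of the conversion (lofarSun.IM implements the full transform; this only shows the underlying arithmetic):

```python
import math

C_LIGHT = 2.99792458e8   # speed of light, m/s
K_B = 1.380649e-23       # Boltzmann constant, J/K
JY = 1e-26               # 1 jansky in W m^-2 Hz^-1

def beam_solid_angle(bmaj_rad: float, bmin_rad: float) -> float:
    """Solid angle (sr) of a Gaussian beam with FWHM axes in radians."""
    return math.pi * bmaj_rad * bmin_rad / (4.0 * math.log(2.0))

def jy_per_beam_to_tb(s_jy_beam: float, freq_hz: float,
                      bmaj_rad: float, bmin_rad: float) -> float:
    """Rayleigh-Jeans brightness temperature (K) for a flux density in Jy/beam."""
    omega = beam_solid_angle(bmaj_rad, bmin_rad)
    return C_LIGHT**2 * s_jy_beam * JY / (2.0 * K_B * freq_hz**2 * omega)
```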

A demo of visualizing LOFAR interferometry data: demo

To use JupyterLab remotely, set up port forwarding with:

ssh -L 1234:localhost:1234  username@server_address

source /data/scratch/zhang/

python -m jupyter notebook --no-browser --port=1234

Change the username, server address, and port (1234) accordingly.


The steps above require LOFAR software, which is not easy to install. Docker can be used to run the steps instead.

For calibration we use the docker image from here.

$ docker run --rm -it lofaruser/imaging-pipeline:latest

(in docker) $ source /opt/lofarsoft/

(in docker) $ DPPP --version

For Visualization we use the docker image “peijin/lofarsun”

$ docker run --rm --hostname lofarsoft -p 8899:8899 \
    -v /HDD/path/to/data/:/lofardata peijin/lofarsun \
    /bin/bash -c "jupyter-lab --notebook-dir=/lofardata \
    --ip='*' --port=8899 --no-browser --allow-root"

This command starts a JupyterLab server in the Docker container and mounts the directory ‘/HDD/path/to/data/’ to ‘/lofardata’.

lofarSun.IM.get_peak_beam_from_psf(fname, thresh=0.618)

Get the peak beam from a FITS file.


The imaging data processing pipeline is based on LINC.


The tool is packaged as a Singularity container, so there is no need to install and configure the dependencies and software.

To pull the container:

singularity pull docker://peijin94/lincsun:latest

To run the container as a shell:

singularity shell -B /my/data:/my/data lincsun_latest.sif

Prepare data and src

Data can be downloaded from the LTA.

Assume the data are stored in the directory ‘/path/to/data’, with all MS files in the subdirectory ‘MS’.

The data processing directory is ‘/path/to/proc’.

Download the source code:

cd /path/to/proc
git clone
cp -r ./LOFAR-Sun-tools/utils/IM/LINC ./

Calibration overview

The workflows and steps of the data processing are described in CWL files, which can be found in the directory ‘LINC/lincSun’.

We need to prepare a JSON file describing the data and the parameters for the processing.

For example:

   "msin": [
            "class": "Directory",
            "path": "/path/to/data/MS/Data001.MS"
            "class": "Directory",
            "path": "/path/to/data/MS/Data001.MS"
   "ATeam_skymodel": {
      "class": "File",
      "path": "/path/to/somewhere/else/A-Team_lowres.skymodel"
   "refant": "CS002LBA"

This example json file tells the workflow to process Data001.MS and Data002.MS, with the A-Team skymodel and the reference antenna CS002LBA.

Then we can run the workflow with the cwltool command inside the LINC container, for example:

singularity exec -B /path/to/data/ /path/to/container/linc_latest.sif \
   cwltool --outdir /path/to/proc/save \
   /path/to/proc/LincSun/workflow/calibrator_sun.cwl \
   /path/to/proc/calibLBA.json

There are two steps in the workflow, gaincal and applycal; both are run in the same way.

The directory or file list for the calibrator or target can be very long when there are many subbands; a script can be used to create the list of files.
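A sketch of such a script: glob the MS directory and emit the {"class": "Directory", "path": ...} entries for the JSON input (paths and file names are placeholders):

```python
import glob
import json

def build_msin_entries(paths) -> list:
    """One {"class": "Directory", "path": ...} entry per measurement set."""
    return [{"class": "Directory", "path": str(p)} for p in sorted(paths)]

def write_linc_input(ms_dir: str, refant: str, out_json: str) -> None:
    """Glob ms_dir for *.MS and write a LINC-style input JSON file."""
    config = {
        "msin": build_msin_entries(glob.glob(f"{ms_dir}/*.MS")),
        "refant": refant,
    }
    with open(out_json, "w") as f:
        json.dump(config, f, indent=3)

# usage:
# write_linc_input("/path/to/data/MS", "CS002LBA", "calibLBA.json")
```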


  • Collect the calibrator data (measurement set files xxx.MS); this can be done with a script.

  • Prepare the script to run the CWL workflow, for example:

#!/bin/bash
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=20
#SBATCH --account=xxxxxxxxxx
#SBATCH --mem=128G
#SBATCH --partition=xxxxxxxxxxxxxxxxxxx
#SBATCH --time=23:59:00
#SBATCH --mail-user=xxxxxxxxxxxxxxxxxxxxxxxx

# set these to your own scratch and processing directories
mytmpdir=/path/to/tmp
procDir=/path/to/proc

export SINGULARITY_TMPDIR=$mytmpdir
export TMPDIR=$mytmpdir
export TEMP=$mytmpdir
export TMP=$mytmpdir

singularity exec -B /path/to/data -B $PWD $procDir/linc_latest.sif \
      cwltool --no-container --basedir $procDir \
      --outdir $procDir/results/ --log-dir $procDir/logs/ \
      --tmpdir-prefix $mytmpdir \
      $procDir/lincSun/workflow/calibrator_sun.cwl $procDir/calibLBA.json

The first few lines starting with ‘#SBATCH’ are for HPC users; ignore them if you are running on a local machine or a single node.

This will create inspection plots for all antennas.

Please check the phase and amplitude solutions for each antenna.

If there is a very bad antenna, one extra step is needed to flag out the corrupted antenna.

A successful run will yield a solution.h5 file in the results directory.


Apply the solution file to the target MS
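DP3's applycal step can read the h5parm directly; a sketch, assuming the solve produced phase000 and amplitude000 soltabs (inspect solution.h5 with LoSoTo to confirm the actual names):

```python
import subprocess

def applycal_cmd(target_ms: str, h5parm: str, correction: str) -> list:
    """DP3 applycal command reading one correction from an h5parm file."""
    return ["DP3", f"msin={target_ms}", "msout=.",
            "msout.datacolumn=CORRECTED_DATA",
            "steps=[applycal]",
            f"applycal.parmdb={h5parm}",
            f"applycal.correction={correction}"]

# usage (requires DP3 in PATH); apply each soltab in turn:
# for corr in ("phase000", "amplitude000"):
#     subprocess.run(applycal_cmd("target.MS", "results/solution.h5", corr),
#                    check=True)
```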