LWA1 Commissioning Scripts
Overview and Requirements
This is a collection of scripts used for commissioning and testing at the LWA1 station. All of the scripts depend on the LSL 0.6.x or 1.0.x branches.
The commissioning scripts build on the LSL 0.6.x/1.0.x framework for working with LWA data and also depend on the following additional Python modules:
No installation (e.g., python setup.py install) is required to use the software. Simply run make in the Commissioning/DRX/HDF5 directory and then use the scripts in the Commissioning directory.
The easiest way to obtain the latest version of the commissioning scripts is via subversion export/checkout:
svn checkout http://fornax.phys.unm.edu/lwa/subversion/trunk/Commissioning
Shell script to run on Linux machines with an attached, powered-on DRSU to report the firmware version of Seagate drives with model number "ST31000525SV". This is useful for identifying disks with the newer (and less reliable) firmware version "CV12".
Given a date/time in the format of "YYYY/MM/DD HH:MM:SS[.SSS]" in local time, compute the corresponding MJD and MPM in UTC. If no date/time is specified, the current local date/time is used for the conversion.
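The conversion itself is straightforward: MJD counts whole days since 1858-11-17 00:00 UTC, and MPM counts milliseconds past UTC midnight. A minimal sketch using only the standard library (the function name is illustrative, not the script's actual interface):

```python
from datetime import datetime, timezone

# Epoch of MJD 0: 1858-11-17 00:00:00 UTC
_MJD_EPOCH = datetime(1858, 11, 17, tzinfo=timezone.utc)

def datetime_to_mjd_mpm(dt):
    """Convert a timezone-aware datetime to (MJD, MPM) in UTC."""
    dt = dt.astimezone(timezone.utc)
    delta = dt - _MJD_EPOCH
    mjd = delta.days                                       # whole days since the MJD epoch
    mpm = delta.seconds * 1000 + delta.microseconds // 1000  # ms past UTC midnight
    return mjd, mpm
```

A local date/time first gets attached to the local zone and then converted to UTC before the subtraction, which is what makes the "local in, UTC out" behavior work.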
For a given MJD value or list of MJD values, return the range of local times associated with that MJD.
List the rise, transit, and set times for the brightest radio objects in the sky.
Script to estimate the data volume of TBN and DRX observations.
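As a rough guide, the volume scales as duration times sample rate times number of recorded streams, inflated by the per-frame header overhead. A sketch of the estimate, assuming the usual LSL frame layouts (1048-byte TBN frames carrying 512 complex 8+8-bit samples; 4128-byte DRX frames carrying 4096 complex 4+4-bit samples); verify these constants against lsl.reader before relying on the numbers:

```python
# Assumed frame layouts (check against lsl.reader.tbn / lsl.reader.drx)
TBN_FRAME_SIZE = 1048   # bytes on disk per frame, header included
TBN_SAMPLES = 512       # complex samples per frame
DRX_FRAME_SIZE = 4128   # bytes on disk per frame, header included
DRX_SAMPLES = 4096      # complex samples per frame

def tbn_volume(duration_s, sample_rate_hz, n_inputs=520):
    """Estimated bytes for a TBN recording of n_inputs (stands x pols)."""
    frames_per_input = duration_s * sample_rate_hz / TBN_SAMPLES
    return int(frames_per_input * n_inputs * TBN_FRAME_SIZE)

def drx_volume(duration_s, sample_rate_hz, n_tunings=2, n_pols=2):
    """Estimated bytes for a single-beam DRX recording."""
    frames_per_stream = duration_s * sample_rate_hz / DRX_SAMPLES
    return int(frames_per_stream * n_tunings * n_pols * DRX_FRAME_SIZE)
```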
Simple script to locate and label where stands are in the array.
Short script to read in a shelter.txt file (log of shelter temperature with each line consisting of a UNIX time stamp and a shelter temperature) and plot all of the available data.
Update: This script is also now compatible with the /data/thermometer##.txt files created by the new SHL MCS.
Short script to plot the input PDU voltages over time from the /data/rack##.txt files generated by the new SHL MCS software.
Short script to read in a temp.txt file (log of DP FPGA temperatures) and plot all of the available data.
Script to plot colormaps of the FPGA temperatures as a function of time, chassis, and physical slot.
Script for checking for missing frames and bad time tags in a full DP TBW capture. If problems are found, the errors are noted and printed out. It shouldn't be too hard to modify this for fewer inputs.
Export select stands from a TBW file to HDF5.
Modified version of tbwSpectra.py that is included with LSL (version >= 0.4.0) that estimates the frequency (in MHz) of each dipole's resonance point and saves that information, along with the time-averaged spectra, to a NPZ file.
Modified version of stationMaster.py that uses Numpy memory maps to work on systems with less than 16 GB of memory.
Modified version of stationMaster.py that uses the 'ClipLevel' keyword in LSL >= 0.4.2 to blank impulsive RFI events.
Modified version of stationMaster2.py that uses Numpy memory maps to work on systems with less than 16 GB of memory.
GUI that interfaces with a NPZ file created by stationMaster.py that makes looking for problems with dipoles/mappings/etc. a point-and-click exercise.
Script to take a single TBW capture and create a RFI-centered HDF5 file for stands 1, 10, 54, 248, 251, and 258 (the outlier). These stands correspond to the four corners of the array, the center, and the outlier. The HDF5 file contains spectral kurtosis values estimated from the data and various statistics about the time series (mean, std. dev., percentiles, etc.)
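For reference, the spectral-kurtosis estimator of Nita & Gary for M raw power spectra is SK = ((M+1)/(M-1)) * (M*S2/S1**2 - 1), where S1 and S2 are the channel-wise sums of the power and of the squared power. A minimal NumPy sketch (the function name is illustrative):

```python
import numpy as np

def spectral_kurtosis(power, axis=0):
    """
    Spectral-kurtosis estimator of Nita & Gary for M raw power spectra
    stacked along `axis`.  For Gaussian noise the expectation is 1;
    impulsive RFI pushes the value above 1, coherent RFI below it.
    """
    m = power.shape[axis]
    s1 = power.sum(axis=axis)          # sum of power per channel
    s2 = (power**2).sum(axis=axis)     # sum of squared power per channel
    return (m + 1.0) / (m - 1.0) * (m * s2 / s1**2 - 1.0)
```

Note that a perfectly constant signal gives SK = 0, Gaussian noise gives values scattered around 1, and thresholds on either side of 1 flag contaminated channels.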
Script to take the output of rfiCheck.py and make a series of plots to look at.
Simple script for making sure that time tags update properly in a TBN file. The script not only checks time tags but also checks for dropped packets and synchronization problems.
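The time-tag check reduces to comparing successive tags against a fixed increment: tags count ticks of the fS = 196 MHz DP sample clock, and each TBN frame carries 512 samples, so consecutive frames from the same input should differ by 512 * fS / sample_rate ticks. A sketch of such a check (function names are illustrative):

```python
FS = 196_000_000        # DP sample clock in Hz; time tags count these ticks
TBN_SAMPLES = 512       # samples per TBN frame

def expected_timetag_step(sample_rate_hz):
    """Time-tag increment, in fS ticks, between consecutive TBN frames
    from the same input."""
    return TBN_SAMPLES * FS // int(sample_rate_hz)

def find_timetag_errors(timetags, sample_rate_hz):
    """Return indices where the tag sequence for one input jumps by the
    wrong amount (dropped frames or bad time tags)."""
    step = expected_timetag_step(sample_rate_hz)
    return [i for i in range(1, len(timetags))
            if timetags[i] - timetags[i - 1] != step]
```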
Create an eye diagram for some portion of a TBN file.
Look for glitches in a TBN file by fitting a quantized sine wave to the data.
Script to plot out time series I/Q data from a TBN file.
Script to skip through a TBN file and check for clipping on stand 10.
Look at what is coming out of the TBN ring buffer and plot which frames, if any, are missing as a function of time in the file.
Take the status code returned by a DP_BOARD_STAT query and break it down into its constituent parts.
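Decoding a packed status word is just masking and shifting. The sketch below uses a hypothetical field layout; the real DP_BOARD_STAT bit assignments come from the DP ICD and should be substituted in:

```python
# Hypothetical (shift, width, name) triples -- NOT the real DP_BOARD_STAT
# layout; replace with the bit assignments from the DP ICD.
FIELDS = [(0, 8, 'board'), (8, 8, 'status'), (16, 16, 'info')]

def decode_status(code):
    """Split a packed status word into named sub-fields by masking/shifting."""
    return {name: (code >> shift) & ((1 << width) - 1)
            for shift, width, name in FIELDS}
```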
JPL-created script for creating gain files to be used by DRX.
JPL-created script for creating delay files to be used by DRX.
Script for setting up a test DRX observation with a variety of tuning, frequency, bandwidth, and gain settings. Needs gain.py and delay.py to work.
Generate gain and delay files for a particular topocentric pointing (azimuth and elevation in degrees) at a specified frequency.
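The underlying geometry: for a unit vector s toward the (azimuth, elevation) pointing and antenna positions r in east/north/up coordinates, the delay relative to the array origin is tau = -(r . s)/c. A sketch of the delay part (the sign convention and function name are assumptions; the real script also has to quantize the delays into the form DP accepts):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def geometric_delays(enu_positions, az_deg, el_deg):
    """
    Per-antenna geometric delays (seconds) toward a topocentric pointing,
    relative to the array origin.  Positions are (east, north, up) in
    metres; azimuth is measured east of north.  Sign convention (assumed):
    an antenna nearer the source gets a negative delay.
    """
    az, el = math.radians(az_deg), math.radians(el_deg)
    # Unit vector toward the source in the ENU frame
    s = (math.sin(az) * math.cos(el),
         math.cos(az) * math.cos(el),
         math.sin(el))
    return [-(e * s[0] + n * s[1] + u * s[2]) / C
            for e, n, u in enu_positions]
```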
Use the integer delay-and-sum beamformer in LSL's misc.beamformer module to estimate the beam shape toward a particular azimuth and elevation and save the beam to a Numpy NPZ file.
Using the beam created by estimateBeam.py, generate a temperature as a function of time for that pointing over an entire day. This script is based on the LSL driftcurve.py script.
Simple script to check the time tag sequence in a DRX file and look for strange jumps and other badness. It can also be used to check the DRX decimation (sample rate) field and whether the time has been set correctly on the various boards.
Check for an offset in the time tags associated with the two tunings beyond what is controlled by the time offset stored in the frame header. This should read in the first complete set of frames from all four beam/tuning/pol. combinations and compare tunings 1 and 2. Any differences between the times are noted.
Given a DRX file with both tunings set to the same parameters, check for coherency across the tunings via cross-correlation. For each set of data, print out the lag (in DRX samples) where the cross-correlation function has its peak and the normalized correlation magnitude relative to the previous set of tuning 1 data.
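The core of such a check is locating the peak of the cross-correlation function. A NumPy sketch (function name illustrative; the returned lag k means y[n] best matches x[n+k]):

```python
import numpy as np

def peak_lag(x, y):
    """
    Lag (in samples) at which the cross-correlation of two complex
    streams peaks, plus the normalized correlation magnitude there.
    The returned lag k satisfies y[n] ~ x[n+k].
    """
    x = np.asarray(x, dtype=np.complex64)
    y = np.asarray(y, dtype=np.complex64)
    cc = np.correlate(x, y, mode='full')          # conjugates the second argument
    lags = np.arange(-(len(y) - 1), len(x))       # lag axis for 'full' output
    i = int(np.abs(cc).argmax())
    norm = np.sqrt((np.abs(x)**2).sum() * (np.abs(y)**2).sum())
    return int(lags[i]), float(np.abs(cc[i]) / norm)
```

With both tunings set identically, the peak should sit at (or very near) zero lag with a magnitude close to 1; a stable non-zero lag points at a time-tag or alignment problem.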
Given two or more DRX files from different beams, check for coherency between the beams and make sure that the beams agree with the T_NOM values. For each beam/tuning pair, print out the lag (converted to ticks), the time tag differences (both raw and T_NOM-corrected), and the normalized correlation magnitude relative to the same-beam tuning cross-correlation.
Script for plotting time series data from a DRX file. This file is also available as part of the LSL package (version >= 0.4.0) but is included here since this is the script we actually used at JPL for pre-acceptance testing.
Modified version of drxTimeseries.py that plots the power of each DRX sample instead of the raw I/Q values. The power is summed over the integration time (listed as the average) and displayed for the interval over which it is computed.
Script for inspecting one second of DRX data every 15 minutes to check the gain settings.
Script for inspecting DR spectrometer files to check the gain settings.
Create an eye diagram for some portion of a DRX file.
Look for glitches in a DRX file by fitting a quantized sine wave to the data.
DRX-Mode Dipole-Dipole and Beam-Dipole Fringing
Read in an SSMIF file and create a set of DRX gain files (zero-delay for dipoles only) that put a single dipole or beam on the X pol. and the outlier on the other. The gain files are constructed such that all data come from the X pol.
SDF-generating version of fringeSets.py.
Script to fringe DRX files that have one dipole on X pol. and another dipole (the outlier probably) on Y pol. Accepts three command line arguments:
- stand number on X pol.
- stand number on Y pol.
- DRX filename(s)
At the end of the correlation, the visibilities are written to a NPZ file.
Similar to fringeDipole.py, but expects the beam to be on X pol. and the dipole on Y pol. Accepts two command line arguments:
- stand number on Y pol.
- DRX filename(s)
Simple script to load in a collection of NPZ files generated by fringeDipole.py/fringeBeam.py and plot the visibility amplitude over time.
Similar to plotFringes.py but displays a waterfall of the visibility and includes the integrated bandpass.
Given a collection of NPZ files generated by fringeDipole.py, simulate the fringes using the lsl.sim.vis.buildSimData() function and the bright sources listed in lsl.sim.vis.srcs.
DRX Processing Scripts with HDF5 Output
Version of drxWaterfall.py that writes to HDF5 rather than NPZ to provide an easier path to getting the data into other packages, e.g., Matlab. This script supports both linear polarization products and Stokes parameters. The HDF5 structure created by the script is listed here.
Script to check the flow of time in a DR spectrometer data file.
Script to scan a DR spectrometer file and look at the data quality.
Simple script to convert a binary DR spectrometer file into an HDF5 file that is compatible with the above HDF5 organization.
Read in a DRX/HDF5 waterfall file and split out the various observations. The observations can be split by:
- Target Source
- Observation ID number
or the script can be used to just list the observations within an HDF5 file.
Read in a DRX/HDF5 waterfall file and calculate the pseudo-spectral kurtosis (pSK) for each observation (XX and YY, or I). The pSK values are "pseudo" since the spectra contained within the HDF5 file are averaged over N FFTs, where N > 1. The pSK value for each channel is calculated by examining M consecutive spectra.
Note: Although the pSK method works reasonably well for short integration times (<~0.3 s), the method may break down for much longer integration times.
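The generalized estimator behind such pSK values (Nita & Gary) accounts for each stored spectrum already averaging N raw FFTs: SK = ((M*N+1)/(M-1)) * (M*S2/S1**2 - 1), with S1 and S2 the channel-wise sums of the M spectra and of their squares. A NumPy sketch (function name illustrative, not the script's interface):

```python
import numpy as np

def pseudo_spectral_kurtosis(spectra, n_ffts, axis=0):
    """
    Generalized spectral-kurtosis estimator for M spectra, each already
    an average of n_ffts raw FFTs, stacked along `axis`.  Expectation is
    1 for Gaussian noise; large N shrinks the usable dynamic range,
    which is why long integrations make the method break down.
    """
    m = spectra.shape[axis]
    s1 = spectra.sum(axis=axis)          # sum of the averaged spectra
    s2 = (spectra**2).sum(axis=axis)     # sum of their squares
    return (m * n_ffts + 1.0) / (m - 1.0) * (m * s2 / s1**2 - 1.0)
```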
Version of plotWaterfall.py for viewing HDF5 files created by hdfWaterfall.py and drspec2hdf.py. This viewer supports all linear and Stokes data products.