Transfer Functions example

Transfer functions are formatted in various ways; this module attempts to make reading and writing between the various flavors easier. Every format is read into a common container that can accommodate all metadata and statistical estimates. This module makes no attempt to plot or analyze the data; that should be done with MTpy v2.0 (development to incorporate TF is in progress in version 2; version 1 mainly supports EDI files). Here are some examples of how to use this module.

[1]:
from mt_metadata.transfer_functions import TF

Structure of TF

The TF object stores the data as an xarray.Dataset, and metadata are stored in a mt_metadata.transfer_functions.tf.Survey object as TF.survey_metadata. By nature, TF.survey_metadata contains station metadata, run metadata, and channel metadata.

Note: The mt_metadata.transfer_functions.tf.Survey and mt_metadata.transfer_functions.tf.Station metadata objects are slightly different from the timeseries versions. The main difference is that mt_metadata.transfer_functions.tf.Station has additional information about the transfer function.

[2]:
tf_object = TF()
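
For example, the nested metadata objects can be reached by attribute access; a quick sketch using the tf_object created above (run metadata is held in a list on the station metadata):

# survey-level metadata
print(tf_object.survey_metadata.id)

# station-level metadata
print(tf_object.station_metadata.id)

# run metadata is a list attached to the station metadata
print(tf_object.station_metadata.runs[0].id)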

Survey Metadata

The container for survey metadata is survey_metadata; this includes citation information, project information, and general information about the survey for which the transfer function was collected.

[3]:
print(tf_object.survey_metadata.to_json(required=False))
{
    "survey": {
        "acquired_by.author": null,
        "acquired_by.comments": null,
        "acquired_by.organization": null,
        "citation_dataset.authors": null,
        "citation_dataset.doi": null,
        "citation_dataset.journal": null,
        "citation_dataset.pages": null,
        "citation_dataset.title": null,
        "citation_dataset.volume": null,
        "citation_dataset.year": "1980-01-01T00:00:00+00:00",
        "citation_journal.authors": null,
        "citation_journal.doi": null,
        "citation_journal.journal": null,
        "citation_journal.pages": null,
        "citation_journal.title": null,
        "citation_journal.volume": null,
        "citation_journal.year": "1980-01-01T00:00:00+00:00",
        "comments": null,
        "country": null,
        "datum": "WGS84",
        "fdsn.alternate_code": null,
        "fdsn.alternate_network_code": null,
        "fdsn.channel_code": null,
        "fdsn.id": null,
        "fdsn.network": null,
        "fdsn.new_epoch": null,
        "funding_source.comments": null,
        "funding_source.email": null,
        "funding_source.grant_id": null,
        "funding_source.name": null,
        "funding_source.organization": null,
        "funding_source.url": null,
        "geographic_name": null,
        "id": "0",
        "name": null,
        "northwest_corner.latitude": 0.0,
        "northwest_corner.longitude": 0.0,
        "project": null,
        "project_lead.author": null,
        "project_lead.email": null,
        "project_lead.organization": null,
        "release_license": "CC0-1.0",
        "southeast_corner.latitude": 0.0,
        "southeast_corner.longitude": 0.0,
        "state": null,
        "summary": null,
        "time_period.end_date": "1980-01-01",
        "time_period.start_date": "1980-01-01"
    }
}
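
Individual fields are set by attribute access following the dotted names above; a minimal sketch (the values are illustrative):

tf_object.survey_metadata.acquired_by.author = "MT Crew"
tf_object.survey_metadata.citation_dataset.doi = "10.0000/example-doi"  # hypothetical DOI
tf_object.survey_metadata.country = "USA"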

Station Metadata

The container for station metadata is station_metadata; this includes important location, orientation, provenance, and transfer_function information. It also includes run and channel information.

[4]:
print(tf_object.station_metadata.to_json(required=False))
{
    "station": {
        "acquired_by.author": null,
        "acquired_by.comments": null,
        "channel_layout": null,
        "channels_recorded": [
            "ex",
            "ey",
            "hx",
            "hy",
            "hz"
        ],
        "comments": null,
        "data_type": "BBMT",
        "doi": null,
        "fdsn.alternate_code": null,
        "fdsn.alternate_network_code": null,
        "fdsn.channel_code": null,
        "fdsn.id": null,
        "fdsn.network": null,
        "fdsn.new_epoch": null,
        "geographic_name": null,
        "id": "0",
        "location.country": null,
        "location.county": null,
        "location.datum": null,
        "location.declination.comments": null,
        "location.declination.epoch": null,
        "location.declination.model": "WMM",
        "location.declination.value": 0.0,
        "location.elevation": 0.0,
        "location.elevation_uncertainty": null,
        "location.latitude": 0.0,
        "location.latitude_uncertainty": null,
        "location.longitude": 0.0,
        "location.longitude_uncertainty": null,
        "location.parcel": null,
        "location.quarter": null,
        "location.section": null,
        "location.state": null,
        "location.township": null,
        "location.x": null,
        "location.x2": null,
        "location.x_uncertainty": null,
        "location.y": null,
        "location.y2": null,
        "location.y_uncertainty": null,
        "location.z": null,
        "location.z2": null,
        "location.z_uncertainty": null,
        "orientation.angle_to_geographic_north": null,
        "orientation.method": null,
        "orientation.reference_frame": "geographic",
        "orientation.value": null,
        "provenance.archive.author": null,
        "provenance.archive.comments": null,
        "provenance.archive.email": null,
        "provenance.archive.name": null,
        "provenance.archive.organization": null,
        "provenance.archive.url": null,
        "provenance.comments": null,
        "provenance.creation_time": "1980-01-01T00:00:00+00:00",
        "provenance.creator.author": null,
        "provenance.creator.comments": null,
        "provenance.creator.email": null,
        "provenance.creator.name": null,
        "provenance.creator.organization": null,
        "provenance.creator.url": null,
        "provenance.log": null,
        "provenance.software.author": null,
        "provenance.software.last_updated": "1980-01-01T00:00:00+00:00",
        "provenance.software.name": null,
        "provenance.software.version": null,
        "provenance.submitter.author": null,
        "provenance.submitter.comments": null,
        "provenance.submitter.email": null,
        "provenance.submitter.name": null,
        "provenance.submitter.organization": null,
        "provenance.submitter.url": null,
        "release_license": "CC0-1.0",
        "run_list": [
            "0"
        ],
        "time_period.end": "1980-01-01T00:00:00+00:00",
        "time_period.start": "1980-01-01T00:00:00+00:00",
        "transfer_function.coordinate_system": "geopgraphic",
        "transfer_function.data_quality.comments": null,
        "transfer_function.data_quality.flag": null,
        "transfer_function.data_quality.good_from_period": null,
        "transfer_function.data_quality.good_to_period": null,
        "transfer_function.data_quality.rating.author": null,
        "transfer_function.data_quality.rating.method": null,
        "transfer_function.data_quality.rating.value": 0,
        "transfer_function.data_quality.warnings": null,
        "transfer_function.id": null,
        "transfer_function.processed_by.author": null,
        "transfer_function.processed_by.comments": null,
        "transfer_function.processed_by.email": null,
        "transfer_function.processed_by.name": null,
        "transfer_function.processed_by.organization": null,
        "transfer_function.processed_by.url": null,
        "transfer_function.processed_date": "1980-01-01",
        "transfer_function.processing_parameters": [],
        "transfer_function.processing_type": null,
        "transfer_function.remote_references": [],
        "transfer_function.runs_processed": [],
        "transfer_function.sign_convention": null,
        "transfer_function.software.author": null,
        "transfer_function.software.last_updated": "1980-01-01T00:00:00+00:00",
        "transfer_function.software.name": null,
        "transfer_function.software.version": null,
        "transfer_function.units": null
    }
}
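
Transfer-function-specific fields live under station_metadata.transfer_function and follow the dotted names above; a short sketch with illustrative values:

tf_object.station_metadata.transfer_function.sign_convention = "+"
tf_object.station_metadata.transfer_function.processed_by.author = "MT Processor"  # illustrative name
tf_object.station_metadata.transfer_function.runs_processed = ["0"]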

Data Container

The data container is an xarray.Dataset, and convenience methods are included to get/set the impedance, tipper, and statistical error estimates, including covariance estimates like those output by EMTF.

The dataset is set up with input and output coordinates. For the sake of generality, the default input and output channels are ex, ey, hx, hy, hz. Any input/output combination that does not have a value is set to nan. This is mainly for convenience when using xarray.

Input Channels

These are the source channels; for natural-source MT these are hx and hy.

Output Channels

These are the response channels; for natural-source MT these are ex, ey, and hz.

Channel Nomenclature

If your data were recorded with non-conventional channel labels, you can adjust the mapping to the conventional channels by setting the TF.channel_nomenclature dictionary.

For example:

tf_object.channel_nomenclature = {"ex":"e1", "ey":"e2", "hx":"h1", "hy":"h2"}

Note that the conventional channel names are the keys and the recorded labels are the values.

[5]:
tf_object.dataset
[5]:
<xarray.Dataset>
Dimensions:                        (period: 1, output: 5, input: 5)
Coordinates:
  * period                         (period) int32 1
  * output                         (output) <U2 'ex' 'ey' 'hx' 'hy' 'hz'
  * input                          (input) <U2 'ex' 'ey' 'hx' 'hy' 'hz'
Data variables:
    transfer_function              (period, output, input) complex128 (nan+na...
    transfer_function_error        (period, output, input) float64 nan ... nan
    transfer_function_model_error  (period, output, input) float64 nan ... nan
    inverse_signal_power           (period, output, input) complex128 (nan+na...
    residual_covariance            (period, output, input) complex128 0j ... 0j
Attributes: (12/14)
    survey:             unknown_survey
    project:            None
    id:                 None
    name:               None
    latitude:           0.0
    longitude:          0.0
    ...                 ...
    datum:              None
    acquired_by:        None
    start:              1980-01-01T00:00:00+00:00
    end:                1980-01-01T00:00:00+00:00
    runs_processed:     [None]
    coordinate_system:  geographic
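
As noted above, any input/output combination without an estimate is filled with nan; a quick check on the empty object using standard xarray selection:

import numpy as np

# select the ex/hy element of the full transfer function; every value is nan
# because nothing has been set yet
zxy = tf_object.dataset.transfer_function.sel(input="hy", output="ex")
print(np.all(np.isnan(zxy.values)))  # True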

The dataset also has attributes that carry the important information needed to describe a transfer function and that are commonly used to make inversion files. These are pulled from station_metadata and survey_metadata.

[6]:
tf_object.station_metadata.id = "mt001"
tf_object.station_metadata.geographic_name = "Long descriptive name"
tf_object.station_metadata.location.latitude = "40:30:10.15"
tf_object.station_metadata.location.longitude = -120.7463
tf_object.station_metadata.location.elevation = 1123
tf_object.station_metadata.location.declination.value = -13.5
tf_object.station_metadata.location.datum = "WGS84"
tf_object.station_metadata.time_period.start = "2020-01-01T00:00:00"
tf_object.station_metadata.time_period.end = "2021-01-01T12:00:00"
tf_object.station_metadata.runs[0].id = "all"
tf_object.station_metadata.acquired_by.author = "MT Master"

tf_object.survey_metadata.project = "Test Project"
tf_object.survey_metadata.id = "CONUS"

tf_object.dataset
[6]:
<xarray.Dataset>
Dimensions:                        (period: 1, output: 5, input: 5)
Coordinates:
  * period                         (period) int32 1
  * output                         (output) <U2 'ex' 'ey' 'hx' 'hy' 'hz'
  * input                          (input) <U2 'ex' 'ey' 'hx' 'hy' 'hz'
Data variables:
    transfer_function              (period, output, input) complex128 (nan+na...
    transfer_function_error        (period, output, input) float64 nan ... nan
    transfer_function_model_error  (period, output, input) float64 nan ... nan
    inverse_signal_power           (period, output, input) complex128 (nan+na...
    residual_covariance            (period, output, input) complex128 0j ... 0j
Attributes: (12/14)
    survey:             CONUS
    project:            Test Project
    id:                 mt001
    name:               Long descriptive name
    latitude:           40.50281944444444
    longitude:          -120.7463
    ...                 ...
    datum:              WGS84
    acquired_by:        MT Master
    start:              2020-01-01T00:00:00+00:00
    end:                2021-01-01T12:00:00+00:00
    runs_processed:     [None]
    coordinate_system:  geographic

Use the convenience properties impedance, impedance_error, tipper, and tipper_error to access the common transfer function estimates. There are also has_* methods (e.g. has_impedance) that tell you whether a given estimate exists.
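
A typical pattern is to check for an estimate before using it; a minimal sketch:

# only pull the tipper if it has actually been set
if tf_object.has_tipper():
    tipper = tf_object.tipper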

TF Attributes

[7]:
print(
    "\n\t".join(
        ["Attributes:"]
        + [
            func
            for func in dir(tf_object)
            if not callable(getattr(tf_object, func)) and not func.startswith("_")
        ]
    )
)
Attributes:
        channel_nomenclature
        dataset
        elevation
        ex
        ex_ey
        ex_ey_hz
        ey
        fn
        frequency
        hx
        hx_hy
        hy
        hz
        impedance
        impedance_error
        impedance_model_error
        inverse_signal_power
        latitude
        logger
        longitude
        period
        residual_covariance
        save_dir
        station
        station_metadata
        survey
        survey_metadata
        tf_id
        tipper
        tipper_error
        tipper_model_error
        transfer_function
        transfer_function_error
        transfer_function_model_error

TF Methods

[8]:
print(
    "\n\t".join(
        ["Methods:"]
        + [
            func
            for func in dir(tf_object)
            if callable(getattr(tf_object, func)) and not func.startswith("_")
        ]
    )
)
Methods:
        copy
        from_avg
        from_edi
        from_emtfxml
        from_jfile
        from_ts_station_metadata
        from_zmm
        from_zrr
        from_zss
        has_impedance
        has_inverse_signal_power
        has_residual_covariance
        has_tipper
        has_transfer_function
        read
        to_avg
        to_edi
        to_emtfxml
        to_jfile
        to_ts_station_metadata
        to_zmm
        to_zrr
        to_zss
        write

Important: set the periods before setting any statistical estimate. Otherwise you will get an error that the new estimate is not the same size as the old one, and a new TF object will need to be created.

[6]:
import numpy as np

n_periods = 6
tf_object.period = np.logspace(-3, 3, n_periods)

Note: The dataset attributes are propagated through to each statistical estimate for easier bookkeeping.

[7]:
tf_object.impedance = (
    np.random.randn(n_periods, 2, 2) + np.random.randn(n_periods, 2, 2) * 1j
)
[8]:
tf_object.has_impedance()
[8]:
True
[9]:
tf_object.impedance
[9]:
<xarray.DataArray 'impedance' (period: 6, output: 2, input: 2)>
array([[[ 1.65186583+0.45547987j,  0.67166023+0.21081722j],
        [ 1.5543931 -0.4429966j ,  0.27890193-0.11407583j]],

       [[-0.17313843+0.44364566j, -1.11876361+0.11880249j],
        [-0.09627517-1.25099461j, -0.49165838+0.8148525j ]],

       [[ 1.56998136+0.38673909j, -1.67511252-0.75316144j],
        [ 0.7168005 +1.80394138j, -0.9412986 -0.31452739j]],

       [[ 0.56301996-0.26286159j,  0.37430568+2.23162363j],
        [ 0.78063784+0.14777685j, -1.04882005-1.26269144j]],

       [[ 0.41692771+0.37048142j,  0.73928062+0.81685108j],
        [-0.160713  +2.25626899j,  1.03115796-1.02773488j]],

       [[-1.08757624-0.09338913j,  0.88506534+0.29577059j],
        [ 0.02965095-1.38893543j, -0.71451393-0.39699605j]]])
Coordinates:
  * period   (period) float64 0.001 0.01585 0.2512 3.981 63.1 1e+03
  * output   (output) <U2 'ex' 'ey'
  * input    (input) <U2 'hx' 'hy'
Attributes:
    survey:             0
    project:            None
    id:                 0
    name:               None
    latitude:           0.0
    longitude:          0.0
    elevation:          0.0
    declination:        0.0
    datum:              None
    acquired_by:        None
    start:              1980-01-01T00:00:00+00:00
    end:                1980-01-01T00:00:00+00:00
    runs_processed:     ['0']
    coordinate_system:  geographic
[10]:
tf_object.dataset
[10]:
<xarray.Dataset>
Dimensions:                        (period: 6, output: 5, input: 5)
Coordinates:
  * period                         (period) float64 0.001 0.01585 ... 63.1 1e+03
  * output                         (output) <U2 'ex' 'ey' 'hx' 'hy' 'hz'
  * input                          (input) <U2 'ex' 'ey' 'hx' 'hy' 'hz'
Data variables:
    transfer_function              (period, output, input) complex128 (nan+na...
    transfer_function_error        (period, output, input) float64 nan ... nan
    transfer_function_model_error  (period, output, input) float64 nan ... nan
    inverse_signal_power           (period, output, input) complex128 (nan+na...
    residual_covariance            (period, output, input) complex128 0j ... 0j
Attributes: (12/14)
    survey:             0
    project:            None
    id:                 0
    name:               None
    latitude:           0.0
    longitude:          0.0
    ...                 ...
    datum:              None
    acquired_by:        None
    start:              1980-01-01T00:00:00+00:00
    end:                1980-01-01T00:00:00+00:00
    runs_processed:     ['0']
    coordinate_system:  geographic

Get impedance element

We can use xarray-style indexing to get at individual elements. Here we are requesting the “Zyx” component at just the first period.

[11]:
tf_object.impedance.loc[dict(input="hx", output="ey")][0]
[11]:
<xarray.DataArray 'impedance' ()>
array(1.5543931-0.4429966j)
Coordinates:
    period   float64 0.001
    output   <U2 'ey'
    input    <U2 'hx'
Attributes:
    survey:             0
    project:            None
    id:                 0
    name:               None
    latitude:           0.0
    longitude:          0.0
    elevation:          0.0
    declination:        0.0
    datum:              None
    acquired_by:        None
    start:              1980-01-01T00:00:00+00:00
    end:                1980-01-01T00:00:00+00:00
    runs_processed:     ['0']
    coordinate_system:  geographic
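
Any xarray selection works; for example, a sketch pulling the full Zxy curve over all periods and the estimate nearest a target period:

# all periods of the xy component (output ex, input hy)
zxy = tf_object.impedance.loc[dict(input="hy", output="ex")]

# nearest-period selection, here for a 10 s target period
z_10s = tf_object.impedance.sel(period=10, method="nearest")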

Reading and Writing

Reading and writing are done through the read and write methods. To write a file you must pass the new file name to be written. If the extension is provided then the appropriate writer is used, or you can specify the writer using file_type.

[12]:
help(tf_object.write)
Help on method write in module mt_metadata.transfer_functions.core:

write(fn=None, save_dir=None, fn_basename=None, file_type='edi', **kwargs) method of mt_metadata.transfer_functions.core.TF instance
    Write an mt file, the supported file types are EDI and XML.

    .. todo:: j-files and avg files

    :param fn: full path to file to save to
    :type fn: :class:`pathlib.Path` or string

    :param save_dir: full path save directory
    :type save_dir: string

    :param fn_basename: name of file with or without extension
    :type fn_basename: string

    :param file_type: [ 'edi' | 'xml' | "zmm" ]
    :type file_type: string

    keyword arguments include

    :param longitude_format:  whether to write longitude as longitude or LONG.
                              options are 'longitude' or 'LONG', default 'longitude'
    :type longitude_format:  string
    :param latlon_format:  format of latitude and longitude in output edi,
                           degrees minutes seconds ('dms') or decimal
                           degrees ('dd')
    :type latlon_format:  string

    :returns: full path to file
    :rtype: string

    :Example: ::

        >>> tf_obj.write(file_type='xml')

[13]:
help(tf_object.read)
Help on method read in module mt_metadata.transfer_functions.core:

read(fn=None, file_type=None, get_elevation=True, **kwargs) method of mt_metadata.transfer_functions.core.TF instance
    Read an TF response file.

    .. note:: Currently only .edi, .xml, .j, .zmm/rr/ss, .avg
       files are supported



    :param fn: full path to input file
    :type fn: string

    :param file_type: ['edi' | 'j' | 'xml' | 'avg' | 'zmm' | 'zrr' | 'zss' | ... ]
                      if None, automatically detects file type by
                      the extension.
    :type file_type: string
    :param get_elevation: Get elevation from US National Map DEM
    :type get_elevation: bool

    :Example: ::

        >>> import mt_metadata.transfer_functions import TF
        >>> tf_obj = TF()
        >>> tf_obj.read(fn=r"/home/mt/mt01.xml")

    .. note:: If your internet is slow try setting 'get_elevation' = False,
     It can get hooked in a slow loop and slow down reading.
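
A minimal round-trip sketch (the file path below is hypothetical):

# write out an EDI file; the extension picks the writer
tf_object.write(fn=r"/home/mt/mt001.edi")

# read it back into a fresh TF object
tf_copy = TF()
tf_copy.read(fn=r"/home/mt/mt001.edi")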
