
AQMS utilities
(python, sql, etc. scripts)
NetOps IX Workshop, University of Washington, March 20-21, 2018
Glenn Biasi, USGS, Facilitator
What AQMS utility do you use most often?
dbselect (9 mentions)
perl scripts (2)
python scripts (3)
sql (3)
shell scripts (3)
pretty even mix of all of them
named others:
stp, jiggle
We use our own utilities mostly, but use Ellen's dbcleanup scripts monthly.
What utility functions do you wish could be added?
Station UI
Event-based ways to assess station performance / contribution
Import metadata through SIS
Centralized configuration GUI for alarms
Efficient import of hypocenters/magnitudes from neighbor networks
Analysis of channel mags for mag residuals
I have a long wishlist of features.
1) ability in Jiggle to see past versions of events and revert to them if desired;
2) ability in Jiggle and/or dbselect to search for events that have comments and for keywords within comments;
3) ability in Jiggle to easily add pick times for stations not in the database and test how they affect the solutions;
4) ability in Jiggle to see event submissions from other networks so you can compare solutions side-by-side;
5) ability in Jiggle's map to plot lines from the epicenter to the stations used in that solution (as a means to quickly gauge azimuthal coverage and prioritize picking);
6) ability in Jiggle to easily toggle between servers (a button to change all affected fields in the jiggle props)
Locally developed utility or feature that might be helpful or of interest to other networks?
managing configuration files with subversion
station metadata loader
"swarm alarm" script, a dbselect query that looks for events concentrated in area and time; when the count exceeds a pre-set threshold, it emails us.
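The "swarm alarm" idea above could be sketched as a simple threshold query. This is a hypothetical illustration, not the network's actual script: the `origin` table and its `lat`/`lon`/`datetime` (epoch seconds) columns are stand-ins for whatever the real dbselect query hits, and the demo uses an in-memory SQLite database rather than the production AQMS database.

```python
import sqlite3
import time

def swarm_check(conn, lat, lon, radius_deg=0.25, hours=6, threshold=5):
    """Count events in a lat/lon box over the last `hours` and return
    True when the count exceeds `threshold` (the alarm condition).
    The `origin` table and its lat/lon/datetime (epoch seconds) columns
    are illustrative stand-ins, not the actual AQMS schema."""
    cutoff = time.time() - hours * 3600
    (n,) = conn.execute(
        "SELECT COUNT(*) FROM origin"
        " WHERE datetime > ? AND lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
        (cutoff, lat - radius_deg, lat + radius_deg,
         lon - radius_deg, lon + radius_deg)).fetchone()
    return n > threshold

# toy demo: 8 recent events clustered near one point
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE origin (lat REAL, lon REAL, datetime REAL)")
now = time.time()
conn.executemany("INSERT INTO origin VALUES (?, ?, ?)",
                 [(46.20 + i * 0.01, -122.18, now - i * 600) for i in range(8)])
print(swarm_check(conn, 46.2, -122.18))   # True: swarm threshold exceeded
```

A real deployment would run something like this from cron and send the email when the check returns True.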
CEUS contributions (next two slides)
Scripting Landscape, CEUS - Memphis
db2cat, output a catalog line for evid (in Finalize)
db2evdir, output an event directory for evid (in Finalize)
dbselect, get parametric information for evid
epidist2, find closest city from summary line
findcatmatch, find a match in NewMad2k.cat for event
makemessage2, make a text file (e.g. report) for evid
rmtrace, remove specified trace from event
stp, get seismograms from archdb
Next page for off-line analysis
Tools Supporting Off-line Analysis, Memphis
RSA keys and the Oracle Instant Client allow the analyst to run these scripts on an offline analysis machine (vilos) and remotely connect to the production postproc system.  Offline analysis tools:
chmag, set magnitude for evid
chvelmod, use a fixed velocity model
db2qml, send quakeml to PDL for evid (done automatically with Finalize)
eqtweet, create and send a tweet for evid; and notweet, disable eqtweet.
etype, set etype in archdb for evid
ev2db, load event directory into archdb
newtrig, create a new trigger in archdb from sac dir
newtrig2, create a new trigger in archdb from sac dir using station name for lat/lon.
nuke_dups, remove duplicate entries in the waveform table.
nuke_dups_archive, remove duplicate entries in the waveform table for years prior to current.
sac2db, load sac files into archdb for evid
sac2db_archive, load sac files into archdb for evid from a year prior to present
startdeep, set starting depth to 25 km
Patterns:
Scripting evolves to address small needs and solve problems.
Low thresholds for using the database promote its use.
SCSN Utilities and Scripts
Main tool set:  tpp scripts
http://vault.gps.caltech.edu/trac/cisn/browser#PP/trunk/perl_utils
Wiki document about them here:
http://vault.gps.caltech.edu/trac/cisn/wiki/PerlUtils
 --
Role switch scripts:
http://scsnwiki.gps.caltech.edu/doku.php?id=rtem:fail_over&s[]=takeover
Bourne shell script changing roles of primary and secondary real-time systems
Includes checklist of things to check:  event generation, duty review, alarm, TMTS2
Another useful utility we have is the "qml" script:
  
http://vault.gps.caltech.edu/trac/cisn/browser/RT/branches/linux-dev-CI-branch/alarming/contrib/bin/qml
Generates QuakeML from the AQMS database
Used to submit to ComCat.
Perl (2800 lines!)
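For orientation, building even a skeletal QuakeML document shows why qml runs to 2800 lines once every AQMS field is mapped. The sketch below is not the qml script itself: it emits a minimal QuakeML 1.2 event using only the Python standard library, and the quakeml:example.org publicID scheme is a placeholder, not the scheme qml actually uses.

```python
import xml.etree.ElementTree as ET

# QuakeML 1.2 namespaces
Q = "http://quakeml.org/xmlns/quakeml/1.2"
BED = "http://quakeml.org/xmlns/bed/1.2"

def minimal_quakeml(evid, lat, lon, depth_m, otime_iso, mag, magtype):
    """Build a minimal QuakeML 1.2 document for one event.
    The quakeml:example.org publicID scheme is a placeholder."""
    ET.register_namespace("q", Q)
    ET.register_namespace("", BED)
    root = ET.Element(f"{{{Q}}}quakeml")
    ep = ET.SubElement(root, f"{{{BED}}}eventParameters",
                       publicID=f"quakeml:example.org/ep/{evid}")
    ev = ET.SubElement(ep, f"{{{BED}}}event",
                       publicID=f"quakeml:example.org/event/{evid}")
    org = ET.SubElement(ev, f"{{{BED}}}origin",
                        publicID=f"quakeml:example.org/origin/{evid}")
    # time/latitude/longitude/depth each wrap their number in a <value>
    # child element; depth is in meters per the QuakeML spec
    for tag, val in (("time", otime_iso), ("latitude", lat),
                     ("longitude", lon), ("depth", depth_m)):
        ET.SubElement(ET.SubElement(org, f"{{{BED}}}{tag}"),
                      f"{{{BED}}}value").text = str(val)
    m = ET.SubElement(ev, f"{{{BED}}}magnitude",
                      publicID=f"quakeml:example.org/mag/{evid}")
    ET.SubElement(ET.SubElement(m, f"{{{BED}}}mag"),
                  f"{{{BED}}}value").text = str(mag)
    ET.SubElement(m, f"{{{BED}}}type").text = magtype
    return ET.tostring(root, encoding="unicode")

xml_doc = minimal_quakeml(38443183, 35.7, -117.5, 8000,
                          "2018-03-20T12:00:00Z", 4.4, "ml")
```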
tpp scripts, Event Info
cattail, dump events since <hours-back>, or in a starttime::endtime window (45 command-line args)
trigtail, dump triggers since <hours-back>
eventhist, show the event's change history
catone, output a summary line for one event
eventAge, output event age in seconds
eventMag, output the event magnitude
showalarms, list alarms generated by one or more events (parsed from ALARM_ACTION rows)
Event Info scripts reflect a lot of work.
Arguments are not intuitive and require the user to remember a lot.
Outputs are brittle; you have to edit the code or write a postprocessor to do something as simple as changing the column order of the output.
Code behavior is inconsistent in things like headers.
tpp scripts, Event Parameter Info
hypophases, dump phases in Hypo2000 format
ampdump, dump amps contributing to the magnitude (AssocAmM)
ampdumpo, dump amps associated with the origin (AssocAmO)
ampdumpsm, dump strong motion station amps associated with the event
checkwavefiles, show the directory and list archived waveform files for the event (requires ssh)
wavedump, dump a list of this event's waveforms
Specialized queries for phase, amplitude, and waveform archive information.
tpp scripts, Station/Channel Info
hypostalist, dump station list in Hypo2000 format; flags adjust output for time and type subsets
makeadhoc, dump station list in 'adhoc' format
stainfo, dump station info for a string like "NC.SJE.HHZ.--" or "CE.%.hn_.0%"; output types "hypo", "kml", "gmt", "adhoc", or default
stadist, sort current stations by distance from a point (<lat.dec> <lon.dec>); supports waveform archiving
Crafted SQL calls, outputs to specific formats.
tpp scripts, Duty Review Page Support
acceptEvent, mark event as reviewed
acceptTrigger, mark trigger as reviewed
deleteEvent, delete event or trigger
undeleteEvent, undelete event or trigger
alarmSend, send alarms
alarmCancel, cancel alarms
finalizeEvent, finalize the event
tpp scripts, Process Control System
puttrans, create a transition definition
deltrans, delete a transition definition
gettrans, show transition definitions from the PCS_TRANSITION database table
getstates, list all postings in the PCS_STATE table whose state description matches the input arguments
post, post event <id> to <group> <table> <state> <rank> in the PCS system (example: post -d mydb EventStream archdb MakeGif 100)
postFromList, post events listed in a file to a state
unpost, remove an event from all states
delstates, delete all events with the given signature
next, get the next event in this state
result, set the event's result value
Specialized scripting for process tracking and control.
tpp scripts, Post Processing Tasks
ampgenpp, create amps and write them to the database; calculates ground motions (PGD, PGV, PGA, and spectral amps) for one event by running the AmpGenPp application found in the jiggle jar
rcgone, generate request cards for one event
tpp scripts, Miscellaneous Utilities
masterrt, output current master RT system name
masterdb, output current master database name
showlocks, show events locked by Jiggle
unlockevent, unlock an event locked by Jiggle
unlockall, unlock all events locked by Jiggle
epoch, show system time in epoch seconds
hypolocate, locate this event with a [remote] SolServer
What is Next For AQMS Utility Development?
Balance between turn-key scripts in the AQMS distro vs. tools to craft your own?
Need/usefulness of greater standardization?
Priorities for AQMS utility development?
tpp: cattail.pl
usage: cattail.pl [option-list] [[#interval] or [startDate [endDate]]]
    options: [-h] [-c] [-d dbase] [-a or [ [-D] [-B0|1]]] [-i|I]
             [-f file] [-e types] [-g|G gtypes] [-p rflags] [-m mag -M mag]
             [ -mt magtypes] [-ma magalgo]
             [-mad rval -MAD rval] [-rms rval -RMS rval]
             [-mz] [-z depth -Z depth] [ -fz]
             [-es esrc -os osrc -ms msrc ] -[2rwL]
             [-u d|w|m ]
    Writes a summary list for events in the past 'N' hours (default = 24), or a specified date range.  45 command-line args.
tpp:  trigtail.pl
usage: trigtail.pl [-d dbase] [-h] [-aqr] <#hrs-back> (default = $hrsback)
    Print a listing summarizing the subnet trigger events (etype=st) that occurred in the past '#hrs-back' hours.
    -d dbase  : alias to use for db connection (defaults to $masterdb)
    -a        : include all events including deleted and duplicates
    -q        : do not print column header title
    -r        : reverse sort order (newest at top)
    example: $0 -d mydb 24
tpp: eventhist.pl
usage: eventhist.pl [-q] [-d dbase]  <evid>
    Show origin and magnitude change history info for an event.
    The list is sorted in descending magnitude lddate order (most recent update first, oldest last).
Example difficulty:
    The event prefmag is flagged by "<+++" after its lddate; otherwise, if it is the preferred of its magtype, it is flagged by "+".
Example problem:
    When the orid is the event prefor and the magid is the event prefmag, both the waveform count and event version numbers are printed.
So to use eventhist.pl, we have to remember the special output cases and the odd flagging convention, and then write another parser to get the extra information out.
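The flagging convention is easy to mishandle, since "<+++" itself ends in "+": any post-parser has to test for the prefmag flag first. A hypothetical sketch of such a parser follows; the sample rows only approximate eventhist.pl's real column layout.

```python
def classify_hist_line(line):
    """Classify an eventhist.pl magnitude row by its trailing flag.
    Checks the "<+++" prefmag flag before the bare "+" flag, since
    "<+++" itself ends in "+". The sample rows below only approximate
    the real column layout."""
    s = line.rstrip()
    if s.endswith("<+++"):
        return "event prefmag"
    if s.endswith("+"):
        return "preferred of its magtype"
    return "superseded"

rows = [
    "1001  4.1 ml  2018-03-20 12:01:07 <+++",
    "1000  4.0 ml  2018-03-20 11:58:44 +",
    " 999  3.9 md  2018-03-20 11:55:02",
]
print([classify_hist_line(r) for r in rows])
# ['event prefmag', 'preferred of its magtype', 'superseded']
```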
tpp: catone.pl
usage: catone.pl -[ht] [-d dbase] <event-id>
    Write a summary line for one event.
    -h        : print usage info
    -t        : write column header
    -d dbase  : alias to use for db connection (defaults to "$masterdb")
    example: catone.pl -d k2db -t 7395710
tpp:  eventAge.pl
usage: eventAge.pl [-d dbase] <evid>
    Prints total integer seconds elapsed since event origin time, or "0" if the input evid does not match an event row in the db.
    -h        : print usage info
    -d dbase  : alias to use for db connection (defaults to "$masterdb")
    example: $0 -d k2db 8735346
Implicit read of Event hash (database hardcode)
tpp:  eventMag.pl
Return the string value for the preferred magnitude of the event associated with the input evid.
Options prepend/append magtype, evid, auth, and version; handles rounding to the nearest 0.1 unit.
usage: eventMag.pl [-d dbase] [-i] [-t] <evid>
    Print the magnitude value of an event.
    -h        : print this usage info
    -d dbase  : alias to use for db connection (defaults to "$masterdb")
    -a        : append the event auth after evid in the event info string
    -v        : append the event auth and its version after evid in the event info string
    -i        : prepend an event info string before the magnitude value
    -I        : append an event info string after the magnitude value
    -l        : format origin as local time and date
    -L        : format origin as date and local time
    -o        : origin date time appended (default is UTC with date first)
    -O        : origin date time prepended (default is UTC with date first)
    -p        : magnitude in parentheses
    -t        : prepend the magnitude type before the magnitude value
    -T        : append the magnitude type after the magnitude value
    -u        : format origin time as UTC time and date
    -U        : format origin time as UTC date and time
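The rounding to the nearest 0.1 unit is a detail worth care in any reimplementation: Python's built-in round() applies banker's rounding to the underlying binary float, so round(4.35, 1) yields 4.3. A decimal round-half-up avoids that surprise; note that whether eventMag.pl itself rounds halves up is an assumption here, not something the slides document.

```python
from decimal import Decimal, ROUND_HALF_UP

def mag_str(value):
    # Round to the nearest 0.1 magnitude unit, with halves rounding up.
    # Going through str() avoids binary-float artifacts like 4.35 -> 4.3.
    # ROUND_HALF_UP is an assumed tie-breaking rule, not eventMag.pl's
    # documented behavior.
    return str(Decimal(str(value)).quantize(Decimal("0.1"),
                                            rounding=ROUND_HALF_UP))

print(mag_str(4.35), round(4.35, 1))   # '4.4' vs 4.3
```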
tpp:  showalarms.pl
usage: showalarms.pl [-d dbase] [evid]
    Print a list of alarms generated by one or more events.
    Data are parsed from rows of the ALARM_ACTION table.
    -h        : print this usage info
    -d dbase  : alias to use for db connection (otherwise does both "$masterrt" and "$masterdb")
    -e        : only list alarms whose action state is ERROR
    -t days   : when there is no evid argument, the number of days back in MOD_TIME for all evids in the table (default: $daysAgo = 7)
tpp:  dbinfo
masterdblookup.pl:
Returns name of DB database that is PRIMARY by querying the RT dbases.
Info also available from MasterDbs.pm
masterrtlookup.pl:
Returns name of RT database that is PRIMARY by querying the RT dbases.
Info also available from MasterDbs.pm
refreshDBInfo.pl
Keeps the file (cfg/masterinfo.cfg) containing master/backup database info current. This file is used by ALL other scripts that need to know about the master dbases.
setupTransitions.pl
setupTransitions.pl [ -d dbname, default: "$masterdb"] [ -f cfgFile evaluated with rules]
    Set up the standard PCS_TRANSITION table rules. Reads rules from the file and inserts them into the table of the connected database.
tpp:  hypophases.pl
Dump phase info for one event in Hypoinverse2000 .ARC format.
Prints a hypoinverse ARC listing of the phase arrival data associated with the prefor of the event, unless the -o option is used, in which case the input id is an origin orid.
242 lines.
ampdump, ampdumpo, ampdumpsm
ampdump: print amp data associated with the preferred magnitude of the event (AssocAmM), or with the input magid if the -m option is used.
ampdumpo: print amp data associated with the preferred origin of the event (AssocAmO), or with the input origin id via the -o option.
ampdumpsm: print strong motion amp data, or the total amp count, associated with the event (AssocEvAmpset).
Each formats an SQL call and organizes the return.
tpp:  checkwavefiles.pl
usage: checkwavefiles.pl [-d dbase] [-1  or -wW wavefilehost] <evid>
    Print the waveform file path found in the database for an event evid.
    Optionally lists path files found on a specified wavefile host using ssh.
tpp:  wavedump.pl
usage: wavedump.pl [-h] [-c|k] [-C] [-d dbase] [-q] [-f] [-r] [-R] <evid>
    Prints summary info about waveforms and/or waveform requests associated with the event.
    Waveform summary info: SNCL, ontime, start-time, end-time, duration-secs, total-bytes, sps
tpp:  hypostalist
usage: hypostalist [-abd:qX:x:I:i:h] [<start-date>] [<end-date>]
Dump station list from CHANNEL_DATA table in Hypoinverse format
10 command-line options to adjust output choices
Crafted SQL with some hardcoding
tpp:  makeadhoc
Lists info for SCEDC seismic stations in AdHoc list format.
    Includes only those seedchan matching the wildcards E% or H%, for nets 'CI', 'AZ', 'FA', 'TA', 'EN', 'SB', 'ZY', 'BC', 'NN', excluding sta 'MIK' and 'CBC'.
    usage: makeadhoc [-d dbase]
Crafted SQL call, formatted output.
tpp:  stainfo
usage: stainfo -[ahqt] [-d dbase] [-f type] <station string>
Prints summary station info ordered by sta, net, location, seedchan.
All formats include lat, lon, and elevation. The default includes channel ondate, offdate, and site name.
Optional formats for output
tpp:  stadist
usage: stadist  -[h] [-d dbase] <lat.dec> <lon.dec>
       stadist  -[h] [-d dbase] <latdeg> <latmin> <londeg> <lonmin>
    Dump a station list in distance order from this lat/lon.
    Only those stations eligible for waveform archiving (rcg) are listed, based on the dbase connection and the rule for archiving.
-> Specialized SQL select, purpose-formatted output.
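stadist's distance ordering reduces to a great-circle calculation plus a sort. A minimal sketch follows, with hypothetical station coordinates standing in for the rcg-eligible list the real script pulls from the database:

```python
from math import asin, cos, radians, sin, sqrt

def dist_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in km, mean Earth radius 6371 km."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2.0 * 6371.0 * asin(sqrt(a))

# hypothetical station coordinates standing in for the database list
stations = {"STA1": (34.0, -118.0),
            "STA2": (35.0, -117.0),
            "STA3": (33.5, -118.5)}
ordered = sorted(stations, key=lambda s: dist_km(34.2, -118.1, *stations[s]))
print(ordered)   # nearest first: ['STA1', 'STA3', 'STA2']
```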