Gaps in GHRSST Services for User Applications

Identifying Gaps in GHRSST services to the users and their applications

Prasanjit Dash [1], Jorge Vazquez [2], Gary Corlett [3]
1. NOAA/CSU CIRA;  2. JPL/Caltech;  3. GHRSST PO/Univ Leicester
Identifying Gaps in GHRSST service delivery: GHRSST service quality

(1) Knowledge gap: between users' expectations of SST products (if clearly defined) and the producer/agency perception of user expectation
(2) Standards gap: between the producer/agency perception of user expectation and the specification standards
(3) Delivery gap: between the specification standards and the actual delivery (meeting specs, routine and incidents)
(4) Communication gap: between the actual delivery and the user perception of the data (wrong use of right data!)
1. Knowledge (awareness) Gap

How well do we know our users and their intended applications?
- e.g., ACSPO produces L2 VIIRS SST at 1 km, but it turns out that L4 developers find L3 easier to assimilate; hence, an L3U product has been developed.
- H-8 AHI L2P SST is produced every 10 minutes, but many users shy away from it because of the data volume, or pick only a subset of the intervals. Should the high temporal resolution then be under-sampled to a lower resolution (see the sketch after this list)?
- Many products may have no users at all! And sometimes users don't find exactly what they need, e.g., gap-free, global, diurnally resolved SST.
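For illustration, a minimal sketch of the kind of under-sampling raised above: aggregating 10-minute AHI L2P granules into hourly means with xarray. The file pattern, variable names, and quality threshold follow GDS-style conventions but are assumptions for this sketch, not a description of any operational product.

```python
# Minimal sketch (not an operational recipe): aggregate 10-minute Himawari-8
# AHI L2P granules into hourly means. Assumes GDS-style netCDF files with
# 'sea_surface_temperature' and 'quality_level' variables; the file pattern
# and the quality threshold are illustrative assumptions.
import glob
import xarray as xr

def hourly_mean_sst(pattern="AHI_L2P_*.nc", min_quality=4):
    files = sorted(glob.glob(pattern))
    ds = xr.open_mfdataset(files, combine="nested", concat_dim="time")
    # Keep only pixels at or above the chosen quality level.
    sst = ds["sea_surface_temperature"].where(ds["quality_level"] >= min_quality)
    # Average the six 10-minute granules that fall within each hour.
    return sst.resample(time="1h").mean(skipna=True)
```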
Key requirements:
Understand what 'they' need and expect (= user expectation).
This reinforces the need for an extensive user survey and for an exchange of information about users, if such a database exists.
2. Standards Gap

How do we set target accuracy and precision?
- Follow agency specs; however, specs are sometimes set by program managers without consulting the producers.
- Ed showed ~77 products at PO.DAAC. Should GHRSST come up with a suggested specification? e.g., before branding a product as a 'GHRSST product', how do we ensure that certain standards are met? (GHRSST can benefit from producers' contributions, but can also be blamed for producers' inefficiencies!)
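As one point of reference for such a specification, accuracy and precision are commonly quantified from satellite-minus-reference matchups (e.g., drifting buoys) as a median bias and a robust standard deviation. The sketch below is illustrative only; the input arrays and the example targets are assumptions, not GHRSST requirements.

```python
# Minimal sketch of how target accuracy and precision are often quantified:
# median bias and robust standard deviation (scaled MAD) of satellite-minus-
# reference differences, e.g. against drifting-buoy matchups. The input
# arrays and the example targets below are illustrative assumptions.
import numpy as np

def accuracy_precision(sat_sst, ref_sst):
    d = np.asarray(sat_sst, dtype=float) - np.asarray(ref_sst, dtype=float)
    bias = np.nanmedian(d)                          # accuracy: systematic error (K)
    rsd = 1.4826 * np.nanmedian(np.abs(d - bias))   # precision: robust std dev (K)
    return bias, rsd

# A specification could then be phrased as, e.g., |bias| < 0.1 K and rsd < 0.3 K
# (example numbers only, not an agreed GHRSST target).
```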
3. Delivery Gap

Gaps in terms of quality, higher-than-expected latency, corrupt data, etc.
- In case of technical failures, data are sometimes corrupt and still make their way to PO.DAAC and the LTSRF. An 'innocent' user will not know this, which will result in wrong application output.
- Is there any QC or warning system, operating in a timely manner, to prevent such data from slipping in? These are random events that are hard to detect quickly (one possible check is sketched below). Any thoughts?
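One possible shape such a warning system could take, sketched minimally: compare each day's global product-minus-reference statistic (e.g., versus CMC or GMPE) with the recent history and raise a flag before the file is released. The function name, inputs, and thresholds are assumptions for illustration, not an existing GHRSST or PO.DAAC mechanism.

```python
# Minimal sketch of an automated delivery check: compare today's global
# product-minus-reference statistic against the recent history and warn
# before the file is pushed downstream. Names, inputs, and thresholds are
# illustrative assumptions only.
import numpy as np

def delivery_warning(todays_diff_field, recent_daily_biases, n_sigma=4.0):
    """todays_diff_field: 2-D product-minus-reference map for today (K);
    recent_daily_biases: daily global median biases over recent weeks (K)."""
    today = np.nanmedian(todays_diff_field)
    center = np.median(recent_daily_biases)
    spread = 1.4826 * np.median(np.abs(recent_daily_biases - center))  # robust SD
    if abs(today - center) > n_sigma * max(spread, 0.01):  # 0.01 K floor
        return f"WARNING: daily median bias {today:+.2f} K departs from recent history"
    return "OK"
```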
A few examples follow on the next two slides.
3. Delivery Gap (continued): example of corrupt data being archived

(a) NAVO VIIRS L2P daytime SST affected by WuCD sensor calibration (and so is ACSPO)
[Figure: maps of NAVO VIIRS minus CMC for a typical day (expected) and for the affected (incident) day]
3. Delivery Gap (continued): example of corrupt data being archived

(b) DMI OISST (system malfunction while decoding MODIS data)
[Figure: maps of DMI minus GMPE for a typical day (expected) and for the incident day — "Shift happens"]
4. Communication Gap

Users are not fully aware of the suitability of the data for specific uses.
- Are long-term L3/L4 products necessarily suitable for trend studies? (See the sketch after this item.)
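To make the trend question concrete, here is a minimal, hypothetical sketch: the significance of a fitted SST trend depends on serial correlation and on the homogeneity of the record, not just on its length, so a long time series is not automatically trend-quality. The input series and the simple AR(1) adjustment below are illustrative assumptions, not a recommended GHRSST method.

```python
# Minimal, hypothetical sketch: estimate a linear SST trend and a 1-sigma
# uncertainty that accounts for lag-1 serial correlation. Illustrative only;
# the monthly anomaly series is assumed, and undocumented processing changes
# in a product can still alias into the fitted slope.
import numpy as np

def trend_with_uncertainty(monthly_anomaly_k):
    y = np.asarray(monthly_anomaly_k, dtype=float)
    t = np.arange(y.size) / 12.0                      # time in years
    slope, intercept = np.polyfit(t, y, 1)            # trend in K per year
    resid = y - (slope * t + intercept)
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]     # lag-1 autocorrelation
    n_eff = y.size * (1.0 - r1) / (1.0 + r1)          # effective sample size
    se = np.std(resid, ddof=2) / (np.std(t) * np.sqrt(max(n_eff, 3.0)))
    return slope, se                                  # trend and 1-sigma error
```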
Key requirements:
This reinforces the need for a quick-start guide to help users choose datasets suitable for their intended applications.
Are these 'gap' concerns valid? If yes, what do we do about it?