WSPR Spot validation #WSPR


Roland
 

It seems many decoded spots contain wrong information, the TX locator for example. There is no data validation in the whole chain, which leads to problems if you are into data analytics.



My proposal is to validate decoded locators against 3rd party DBs like qrz.com etc. on the client side. Something like this:

Compare the decoded locator with 3rd party databases (qrz.com etc.): match callsign and locator; if there is no match, discard the spot - no upload. If the call/locator pair is not registered in a 3rd party DB, accept the spot and upload it.

I think this would eliminate a lot of wrong spots, like this one for example.



GM1BAN is registered on qrz.com and his locator is nowhere near the South Atlantic.

What do you think?
73
Roland
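The rule proposed above could be sketched roughly like this. It is a minimal illustration only; `lookup_locator` is a hypothetical callable standing in for a QRZ-style registry query (returning the registered grid, or `None` for unregistered calls), not a real API:

```python
def should_upload(callsign: str, decoded_grid: str, lookup_locator) -> bool:
    """Apply the proposed client-side rule: discard spots whose decoded
    grid contradicts the registered locator; accept everything else."""
    registered = lookup_locator(callsign)  # hypothetical registry query
    if registered is None:
        # Call/locator pair not registered in the 3rd party DB:
        # accept the spot and upload it.
        return True
    # Compare on the 4-character grid square, since WSPR normally
    # carries 4 characters; a mismatch means discard (no upload).
    return registered[:4].upper() == decoded_grid[:4].upper()
```

With a registry mapping GM1BAN to IO85, a decode placing GM1BAN in the South Atlantic would be discarded, while an unregistered call would still be uploaded.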


Bill Somerville
 

On 03/11/2021 12:22, Roland wrote:
It seems many decoded spots contain wrong information, the TX locator for example. There is no data validation in the whole chain, which leads to problems if you are into data analytics.
Hi Roland,

do you have evidence that the stations you mention are sending the wrong locator in their WSPR beacon transmissions? It may be that the invalid location information is being added in one of the downstream databases.

73
Bill
G4WJS.


Roland
 

Hi Bill,

I am the founder of the Intl. WSPR Beacon Project; GM1BAN is one of our beacons. Here are his transmissions: Link
That is just one example. I don't know how many there are in total; it is difficult to quantify exactly.

Cheers
Roland


Tom V. Segalstad
 

 

We see from time to time that wrong decodes of weak signals are plotted on DXMAPS.COM (and other maps). But I assume that such wild decoding and reporting cannot be avoided?

 

Some examples of wrong FT8 decoding on 50.313 MHz here during the last week (WSJT-X v. 2.5.0):

 

181130 -24  2.0 1118 ~  FT0XHT 5R9KHX JP91

 

104745 -24  2.3 2951 ~  7I7IBM/P QF6GOM/P R KP02

 

122845 -20  4.0  899 ~  PT2KFL/R IP0RT PC08

 

122115 -19  4.1  210 ~  M52FPI/P TX5SPH BQ51

 

73 from Tom, LA4LN

 

 

From: Bill Somerville
Sent: Wednesday, 3 November 2021 at 13:33
To: main@WSJTX.groups.io
Subject: Re: [WSJTX] WSPR Spot validation #WSPR

 



Roland
 

The question I have is: can we validate some of the spot data, like the locator, before it gets uploaded, as explained in my opening post?


Bill Somerville
 

On 03/11/2021 12:38, Roland wrote:
Hi Bill,

I am the founder of the Intl. WSPR Beacon Project; GM1BAN is one of our beacons. Here are his transmissions: Link
That is just one example. I don't know how many there are in total; it is difficult to quantify exactly.

Cheers
Roland

Hi Roland,

is GM1BAN sending a 6-character grid locator? If it is, you could reduce the likelihood of false decodes by using a 4-character grid square or checking the "Prefer Type 1 Messages" option, which ensures only 4-character grids are sent.

73
Bill
G4WJS.
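As an aside, a 4-character Maidenhead grid square has a fixed structure (two field letters A through R followed by two digits), so a purely local format sanity check is possible before any registry lookup. A minimal sketch, not taken from any real decoder:

```python
import re

# Maidenhead 4-character grid square: field letters A-R, square digits 0-9.
GRID4_RE = re.compile(r"^[A-R]{2}[0-9]{2}$")

def is_valid_grid4(grid: str) -> bool:
    """Return True if `grid` is a structurally valid 4-character square."""
    return bool(GRID4_RE.match(grid.upper()))
```

Note this only rejects structurally impossible grids; it cannot catch a plausible-but-wrong square like the South Atlantic example, which needs a registry comparison.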


Roland
 

Bill, GM1BAN is only sending a 4-character locator, as do all of our beacons.


Bill Somerville
 

On 03/11/2021 12:56, Roland wrote:
Bill, GM1BAN is only sending a 4-character locator, as do all of our beacons.
Hi Roland,

that was not clear from the link you sent when I first asked, hi.

Many of the large-volume WSPR spotting sources are not using WSJT-X, so I guess they are forming spots from their own decoders, or more likely from running the underlying wsprd decoder application. Do you know if these bad spots are more prevalent from particular types of spot sources?

73
Bill
G4WJS.


Michael Black
 

It could be filtered with a QRZ lookup.
I have a filter for the JTAlert email interface that does a QRZ lookup to avoid bad decodes like this, and it uses its own local cache to reduce the QRZ queries to almost zero.
I also put the filter in my spot filtering program that sits between the cluster and Log4OM, to avoid Log4OM triggering on false decodes too.

But with the spot reporting being done internally in WSJT-X, the QRZ validation would have to be added internally unless we migrated it externally.

Mike W9MDB
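The local cache Mike describes might look roughly like this. This is a sketch under assumptions: `lookup_fn` is a hypothetical stand-in for a QRZ-style remote query, not a real QRZ API:

```python
import time

class CachedLookup:
    """Front a slow or metered callsign lookup with a local TTL cache,
    so repeated decodes of the same call cost almost no remote queries."""

    def __init__(self, lookup_fn, ttl_seconds=30 * 86400):
        self.lookup_fn = lookup_fn   # hypothetical remote query function
        self.ttl = ttl_seconds       # how long a cached entry stays fresh
        self._cache = {}             # callsign -> (locator, fetched_at)

    def locator_for(self, callsign):
        entry = self._cache.get(callsign)
        if entry is not None and time.time() - entry[1] < self.ttl:
            return entry[0]          # fresh cache hit: no remote query
        locator = self.lookup_fn(callsign)   # miss or stale: refresh
        self._cache[callsign] = (locator, time.time())
        return locator
```

How long the TTL should be is a policy question; the sketch just makes it a parameter.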








 

I don't know what the most reliable source of locators is. eQSL.cc, LoTW, WSJT-X and QRZ.com are all populated by the operator.

 

73 Phil GM3ZZA IO85FU69.

 


 

From: Roland
Sent: 03 November 2021 12:30
To: main@WSJTX.groups.io
Subject: [WSJTX] WSPR Spot validation #WSPR

 



Roland
 

Every operator is responsible for his own data.


Roland
 

On Wed, Nov 3, 2021 at 02:03 PM, Michael Black wrote:
It could be filtered with a QRZ lookup.
I have a filter for the JTAlert email interface that does a QRZ lookup to avoid bad decodes like this, and it uses its own local cache to reduce the QRZ queries to almost zero.
I also put the filter in my spot filtering program that sits between the cluster and Log4OM, to avoid Log4OM triggering on false decodes too.
 
But with the spot reporting being done internally in WSJT-X, the QRZ validation would have to be added internally unless we migrated it externally.

It would be great to have this functionality integrated into WSJT-X. I like the idea of caching for a certain time.

Roland


Michael Black
 

The QRZ cache can be kept for a VERY long time... maybe a year or more.
The probability that a callsign that was once good (a silent key, or a changed call) shows up in a bad decode is almost nil.

Mike W9MDB





Roland
 

People move and change locations for many reasons. Therefore I think a year is too long for a client-side call/locator cache.


Michael Black
 

We're trying to avoid bad decodes with bogus callsigns... not bad grids, which could come from a rover or from somebody operating temporarily who has not updated their QRZ location.

The examples given were all bad decodes with invalid callsigns.


Mike



Roland
 

On Wed, Nov 3, 2021 at 02:01 PM, Bill Somerville wrote:
Many of the large volume WSPR spotting sources are not using WSJT-X so I guess they are forming spots from their own decoders, or more likely from running the underlying wsprd decoder application. Do you know if these bad spots are more prevalent from particular types of spot sources?
That's another good question; I think it needs more research. To stick with the GM1BAN example and the wrong locator: that spot was decoded by WSJT-X 2.1.2.




 


Roland
 

My post is about invalid callsign/locator pairs. GM1BAN is a valid callsign.


Bill Somerville
 

Mike,

please review the thread so far, you may have missed the point.

73
Bill
G4WJS.


Michael Black
 

There are two points... one is bad decodes; the other is inaccurate grid reports, which perhaps come from some internal error, like not clearing the grid when a new call is decoded, so that an old grid gets used by default?

Mike  W9MDB
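The internal error Mike speculates about, and the obvious defensive fix, can be illustrated in a few hypothetical lines. This is not WSJT-X code, just a sketch of the failure mode:

```python
class SpotState:
    """Holds the last decoded call and grid. If the grid were NOT
    cleared when a new call arrives, a stale grid from the previous
    decode could be reported for the new call by default."""

    def __init__(self):
        self.call = None
        self.grid = None

    def update(self, call, grid=None):
        if call != self.call:
            self.grid = None  # defensive fix: drop the stale grid
        self.call = call
        if grid is not None:
            self.grid = grid
```

Without the clearing line, a decode of a new call that carries no grid would silently inherit the previous station's grid.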






Bill Somerville
 

Mike,

QRZ.COM callsign data lookups are not a free service, and we have no intention of requiring that WSJT-X WSPR spotters use such a service. Caching such lookup data seems unwise, as no data update service is available to my knowledge, and storing the lookup data in bulk is probably against the terms of use of the QRZ.COM service anyway.

73
Bill
G4WJS.
