Dashboard - no Download option?

  • 2
  • Problem
  • Updated 2 months ago
With the changes to the website making everything much harder to do, I am now struggling to download the data from my weather station.

From the My Devices page, nothing seems to be working properly and there is no way to get to the dashboard. Pretty crappy, but I have managed to get round that by using the dashboard URL directly: https://www.wunderground.com/dashboard/pws/xxxxxx

On that page I used to be able to choose "Table", select a date and then click "Download" to get the day's data. That option has disappeared. I have tried Chrome and Microsoft Edge, on 3 different computers, logging out and back in... nothing works. Is there some way I can get my data?

Rob Thomlinson

  • 5 Posts
  • 6 Reply Likes

Posted 4 months ago

  • 2

Tom

  • 52 Posts
  • 21 Reply Likes
I agree. No .csv on my page either. That is a very big backup issue for anyone who uses WU as the primary place to store and process the data from their station (which is probably tens of thousands of people at a minimum). ---> This is important, Victoria.

Victoria Gardner, Official Rep

  • 620 Posts
  • 88 Reply Likes
I will ask the web site folks, but it may take some time to get an answer.

--Victoria
victoria.gardner@ibm.com

John Bittorf

  • 6 Posts
  • 0 Reply Likes
Hi Victoria. It's been 2 months since this feature went missing. Is it coming back?

Victoria Gardner, Official Rep

  • 620 Posts
  • 88 Reply Likes
No, not that I know of.

--Victoria

V. Kelly Bellis

  • 5 Posts
  • 1 Reply Like
I completely agree: the ability to download data not just for my weather station but for other weather stations within a given area is critical for QC analysis. Removal of the download option greatly diminishes the value of the WU site and raises questions about the merit of, and incentive for, sharing my data in the first place.

Please restore the option to download individual weather station data, and consider improving its format and download options; e.g., user-defined decimation of time-based records, filtering out of unwanted fields, control over field ordering, etc.

Likewise, consider allowing the data from multiple weather stations for a given period to be downloaded in a single .csv. One check-box option could include all weather stations within a reasonable radius of station X.


Victoria Gardner, Official Rep

  • 618 Posts
  • 88 Reply Likes
Kelly, you might want to look into the API. All of that downloading is exactly what the API is designed for. It's unlikely that there's going to be a whole new set of options duplicating the API.

--Victoria
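
For anyone wondering what "using the API" actually looks like, here is a minimal sketch in Python against the PWS history endpoint that appears later in this thread; the station ID, API key and date below are placeholders, not real values.

import requests

# Placeholders: substitute your own station ID and API key.
STATION_ID = 'KXXXXXXX'
API_KEY = 'your-api-key'

# One day of PWS history as JSON; the date format is yyyymmdd.
url = ('https://api.weather.com/v2/pws/history/all'
       f'?stationId={STATION_ID}&format=json&units=m'
       f'&date=20190414&apiKey={API_KEY}')

resp = requests.get(url, timeout=30)
resp.raise_for_status()
observations = resp.json()['observations']
print(len(observations), 'observations returned for the day')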

Tom

  • 52 Posts
  • 21 Reply Likes
Though you make some valid points, I don't see WU as it operates now offering any of these as 'features', though I do believe they should be considered. Back in the pre-takeover days you could download the .csv file from any station (manually), but immediately after the purchase (to the best of my recollection) that was removed for all stations but your own. Additionally, you could delete bad data sets, but that option is gone too. In truth I do not believe that WU as it is positioned now actually cares about getting good data or about assisting individual PWS owners in assessing the validity of their data. (This does not mean that some people - Victoria - are not trying their best; they just don't have much to work with.)

I can only suggest that PWS owners try to send their data to something like CWOP, which has a built-in data analysis system (https://weather.gladstonefamily.net/site/F2338)

Of primary importance is the ability to store and retrieve your own data. Because WU provided this, many (most) manufacturers did not provide a separate method. It was assumed that by uploading your data to WU, you would be able to retrieve that data later. I am sure that manufacturers are preparing alternative methods because of the changes to WU (actually I know they are, as I consult on the side). It would be in WU's best interest to return this 'feature', though I personally see it as a necessity; then again, there does not seem to be much that anyone can do to return WU to being a customer-centric organization.

The days are numbered when manufacturers build in reporting to WU as the primary means of data collection and maintenance.
(Edited)

Victoria Gardner, Official Rep

  • 618 Posts
  • 88 Reply Likes
I just want to note again that I haven't gotten an answer on this yet.  I have no idea why it's gone. I've added it to the problem queue in addition to the requested features queue.  

--Victoria

Claude Felizardo

  • 30 Posts
  • 6 Reply Likes
In the past, if I forgot to stop uploads before doing maintenance and inadvertently triggered the rain gauge, or sent incorrect values while swapping sensors, I could remove the known bad data. We have lost that ability.

Tom

  • 52 Posts
  • 21 Reply Likes
So very true. Most of us have our equipment mounted on a pole of some sort. Just getting it down to clean the rain gauge will tip the little rain cups inside and trigger a rain event. I generally leave mine online purposely during maintenance.
(Edited)

V. Kelly Bellis

  • 5 Posts
  • 1 Reply Like
Thank you for the reply, Victoria.

While I already have an API key, I don't know how I could use it to download my weather station data.

Tom

  • 52 Posts
  • 21 Reply Likes
The API cannot do that. I can, with proper decoding, report what the conditions were at a given time. If you are not familiar with JSON decoding, you are sunk. That was the beauty of the .csv download. Import it into excel and literally almost anyone could understand their data.

Tom

  • 52 Posts
  • 21 Reply Likes
For some reason I cannot edit the above post so here it is corrected: The API cannot do that. It can, with proper decoding, report what the conditions were at a given time. If you are not familiar with JSON decoding, you are sunk. That was the beauty of the .csv download. Import it into excel and literally almost anyone could understand their data.
(Edited)

Victoria Gardner, Official Rep

  • 618 Posts
  • 88 Reply Likes
Actually, Tom is not quite correct. (Tom is almost always right.)  Because I do so much testing, I am constantly throwing API calls around like candy from a holiday float.  

There are environments in which you can just dump the data, such as Postman. If you're in a secure setting, you can even do it from the address bar of your browser.

Tom is correct that JSON output (which is what the API gives) is not the same as CSV (comma-separated values). It includes things like squiggly brackets {} and square brackets [] to structure the output, so depending on what you're doing it won't be a drop-in replacement. But if you're mostly just storing data for backup, it will do that.

And although Rob below is being a little discouraging, none of this is impossible to learn, even if you don't call yourself "tech savvy". A lot of people with PWSs find that they become more and more curious about the data, and APIs are a great way to venture into the world of data. Languages like R are relatively easy to learn (and there are lots of online courses if you'd prefer that), and then you can have a ball with it.

I have not yet gotten an answer about what's happening with the download button.  I will be sure to share as soon as I learn.

--Victoria
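
To illustrate the backup point, here is a minimal sketch (with placeholder station ID, key, date and filename) that pulls one day of observations and flattens the JSON into a CSV with pandas; it assumes the same history endpoint used in the script later in this thread.

import requests
import pandas as pd

# Placeholders: substitute your own station ID, API key and date (yyyymmdd).
url = ('https://api.weather.com/v2/pws/history/all'
       '?stationId=KXXXXXXX&format=json&units=m'
       '&date=20190414&apiKey=your-api-key')

obs = requests.get(url, timeout=30).json()['observations']

# json_normalize (pandas 1.0 or later) flattens the nested JSON, including the
# 'metric' block, into plain columns - close to what the old CSV download gave.
df = pd.json_normalize(obs)
df.to_csv('pws_20190414.csv', index=False)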

Tom

  • 52 Posts
  • 21 Reply Likes
Mmmm... It's a little bit more complicated than squiggly lines. But I will let you run with that. lol
(Edited)

Rob Thomlinson

  • 5 Posts
  • 6 Reply Likes
My apologies if I appeared discouraging. I would encourage people to code and understand data; in fact I do exactly that every day as a professional changing the way people can work with statistical data. Unfortunately, the user testing we do day in and day out reveals that most people aren't interested in getting their heads round it (or are not capable).

When you look at the people who give Weather Underground their data for free every day, many will have spent years keeping logs in notebooks, progressed to more sophisticated equipment while still keeping a notebook record, then moved to a spreadsheet and eventually got their data as a CSV straight into that spreadsheet (which is dead easy).

Requiring people to unpack a three-level JSON dataset, or to code a solution and wrangle the data into something useful, is a huge leap which is more likely to simply lose you the people who give you data for free.

We made the same assumption as you, and over the last year we have tested it with data professionals and had it thrown back in our faces. Good luck with your beliefs, but I'm afraid you are more likely to lose your contributors than to succeed in getting them to work with the API.

Victoria Gardner, Official Rep

  • 618 Posts
  • 88 Reply Likes
Thank you for sharing that.  The utility of having discussions like these is why we have the Forum in the first place.  

--Victoria

Claude Felizardo

  • 30 Posts
  • 6 Reply Likes
I agree, you don't need to write any code to be able to import a CSV file into a spreadsheet or plotting package.

Rob Thomlinson

  • 5 Posts
  • 6 Reply Likes
An API might be a solution for the very tech savvy, but I can tell you from experience that most people won't have a clue how to use it, or even what it is. An API is a geek feature, not a general-user feature.

V. Kelly Bellis

  • 5 Posts
  • 1 Reply Like
I only got the WU API key to try to use it in weewx, notably in the forecast extension. Sadly, the new WU key and related commands don't work with weewx. Matt Wall, or somebody else, will need to fix the thing that wasn't broken until recently. As for decoding JSON, I don't have that skill.

The idea of trying to QC my own station's data is in preparation for sending it off to CWOP via weewx, something I've been reluctant to do just yet.
(Edited)

Tom

  • 52 Posts
  • 21 Reply Likes
As far as I know, CWOP and Awekas are the only organizations with a built-in QC system: https://www.awekas.at/en/qualcheck.ph...

Claude Felizardo

  • 28 Posts
  • 6 Reply Likes
I upload to both WU and CWOP and look at both as well as other analysis web pages frequently to confirm my data is being received.   

Tom

  • 52 Posts
  • 21 Reply Likes
So... on a whim, I wanted to see if I could find an alternative method of doing what the .csv download used to do, and surprisingly you can now do something that you previously could not: simply copy the table and paste it into Excel! I would prefer not to have the units in every single cell, but at least I can save and sort.


(Edited)

John Bittorf

  • 6 Posts
  • 0 Reply Likes
To separate the units from the values, paste your data into a .txt file and import it into Excel. You can then use the "space" delimiter to split them.
(Edited)
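
If you would rather script that step, here is a minimal sketch in Python/pandas, using a hypothetical filename for the pasted table; it keeps the number and drops the trailing unit in each cell, while text columns such as wind direction would need separate handling.

import pandas as pd

# Hypothetical filename: the table copied from the dashboard, saved as tab-separated text.
df = pd.read_csv('wu_table.txt', sep='\t')

# Split "48 °F"-style cells on whitespace and keep only the leading number.
for col in df.columns[1:]:  # assume the first column is the timestamp
    df[col] = pd.to_numeric(df[col].astype(str).str.split().str[0], errors='coerce')

df.to_csv('wu_table_clean.csv', index=False)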

V. Kelly Bellis

  • 5 Posts
  • 1 Reply Like
Thanks Tom for sharing this bit of whimsy! It's a doable workaround until WU returns the download option.

As for CWOP, I believe that it's actually MADIS that's doing the QC math, though I've yet to finish my own QC experiments and learn more about what's entailed on their end.

Tom

  • 52 Posts
  • 21 Reply Likes
And I have to say that I love what MADIS is an acronym for: the Meteorological Assimilation Data Ingest System! Makes me hungry just typing it.


But yes you are correct. MADIS, AWIPS, MesoWest, NOAA, NWS are all linked together from an individual PWS owner’s standpoint. Report to CWOP and you get them all.
(Edited)

Wade Gibson

  • 7 Posts
  • 0 Reply Likes
+1 for missing the Download to CSV option in Table View. I noticed that we are also missing the custom date range, and temperature values are rounded to the nearest whole degree. I used these features with my PWS twice a month to find the low, high, and average temperatures between propane tank fillings and electrical meter readings.

Tom

  • 52 Posts
  • 21 Reply Likes
Hey +1 for the custom date.   I did not catch that.  

Also a couple of other things.

Monthly mode produces a table/graph only for the month selected; there are no monthly averages, and no annual table or chart. Many of us used that annual table as a 'summary' of the year.

The gust column is also missing in weekly and monthly mode. We get a maximum gust value but have no idea on what day or at what time it occurred.

Also, we would much rather have a single decimal place (48.2) than the units in every column (48 °F).

Tom

Rob Thomlinson

  • 5 Posts
  • 6 Reply Likes
For anyone still struggling with this, I have written a short Python program (below) which downloads the daily data into an Excel spreadsheet.  It creates a new workbook per day, with the daily information on the first tab, and summary data for the day on the second tab.  It has been built and tested for our weather station data so you may need to modify it to get what you want.  Feel free to use it and share it as you want.

You need to change 5 things (notes in square brackets) in the application to get it to work:

At the top, you need:

Your weather station ID and API key
The first and last date you want to grab the data for

At the bottom you need the path where you want the data to be stored

I have hard-coded the column names and the aggregations I want, which work with our data. If you want something different you will need to change it. I'm afraid I won't be able to help with that.

For reference, this was written using the Python 3.7 Anaconda distribution 

*******************************************************************************************

import requests
import pandas as pd

station_ID = '[Your Station ID]'
station_key = '[Your API Key]'
start_date = [Date in yyyymmdd format, eg 20190414] 
end_date = [Date in yyyymmdd format]
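# NOTE: the loop below treats start_date and end_date as plain yyyymmdd integers,
# so keep each run within a single month (see the follow-up replies at the end of the thread).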

for x in range(start_date, end_date + 1):
    WU_date = str(x)
    
    url = 'https://api.weather.com/v2/pws/history/all?stationId=' + station_ID + '&format=json&units=m&date=' + WU_date + '&apiKey=' + station_key
    df = pd.read_json(url)
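    # Flatten the nested JSON: one row per observation, then join the nested 'metric' fields alongside.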
    tf = pd.read_json( (df['observations']).to_json(), orient='index')
    tf2 = pd.read_json( (tf['metric']).to_json(),  orient='index')
    tf3 = tf.join(tf2)
    
    tf_out = tf3[['obsTimeLocal', 'tempHigh', 'dewptHigh', 'winddirAvg', 'windspeedAvg','windgustHigh','pressureMax',
                  'humidityAvg', 'precipRate', 'precipTotal', 'uvHigh', 'solarRadiationHigh']]
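    # Build a small summary (daily highs, lows and means) for the second sheet.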
    
    tf_sum = {'Record' :['obsTimeLocal', 'tempHigh', 'tempLow', 'dewptHigh', 'dewptLow', 'winddirAvg', 'windspeedAvg','windgustHigh',
                         'pressureMax', 'pressureMin', 'humidityAvg','precipRate', 'precipTotal', 'uvHigh', 'solarRadiationHigh'],
            'Observation' :[tf3['obsTimeLocal'].max(), tf3['tempHigh'].max(), tf3['tempLow'].min(), tf3['dewptHigh'].max(), 
                            tf3['dewptLow'].min(), tf3['winddirAvg'].mean(), tf3['windspeedAvg'].mean(), tf3['windgustHigh'].max(), 
                            tf3['pressureMax'].max(), tf3['pressureMin'].min(), tf3['humidityAvg'].mean(), tf3['precipRate'].max(), 
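                            # NOTE: row index 275 is hard-coded for this particular station's upload cadence;
                            # it appears intended to pick up the day's final running precipTotal.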
                            tf3.at[275, 'precipTotal'], tf3['uvHigh'].mean(), tf3['solarRadiationHigh'].mean()]
    }
    
    tf_sum_base = pd.DataFrame(tf_sum, columns=['Record', 'Observation'])
    tf_sum_out = tf_sum_base.transpose()
    
    print(WU_date)
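    # Write one workbook per day: a detail sheet named after the date plus a 'Summary' sheet.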
    
    out_path = '[path to the folder where you want to store the data, e.g. //net1/underground Data]' + WU_date + '.xlsx'
    with pd.ExcelWriter(out_path) as writer:
        tf_out.to_excel(writer , sheet_name= WU_date)
        tf_sum_out.to_excel(writer , sheet_name= 'Summary')

V. Kelly Bellis

  • 5 Posts
  • 1 Reply Like
I will test this out later, but had to first say how very much I appreciate you doing this!

Earlier this week I collected daily data downloads manually, one day at a time for 75 days.

And just as a precaution, am I understanding correctly that
start_date = [Date in 20190201]
end_date = [Date in 20190416]
will accomplish getting data for 75 days?

Thanks again for making this and for sharing!!

Kind regards,

Kelly


Rob Thomlinson

  • 5 Posts
  • 6 Reply Likes
I'm afraid the code is a bit clunkier than that, sorry. You can do all the days in a single month in one run, but you will have to do the 3 months separately.

I keep meaning to build a bit more intelligence into it, but at the moment it treats the dates as plain numbers, so it doesn't know 20190250 isn't a date, and it breaks when it can't get a record. At least it would only be 3 runs, not 75...

It will create a workbook for each date you want. .