Author Topic: Kibana dashboard for WS

Offline rwa2
Kibana dashboard for WS
« on: December 07, 2015, 05:02:02 PM »
Hello, just wanted to share my notes on getting my WS-1400-IP data into an ELK stack (Elasticsearch, Logstash, Kibana). This may be a bit of overkill, but it lets you collect and graph all the data in flexible dashboards. It's probably too heavy to run on a Raspberry Pi, but it works fine so far on my 4GB Atom server. A screenshot of a quick sample dashboard is attached.

Step 1:  Get an ELK stack going
You can install these individually if you want... I just used https://github.com/deviantony/docker-elk to quickly get it up and running using docker-compose.
One small tweak you'll need:  go to logstash/config/logstash.conf and add a json filter, so it looks like:
Code:
## Add your filters here
filter {
  json {
    source => "message"
  }
}
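For reference, the stock logstash.conf in that repo already has a tcp input on port 5000 and an elasticsearch output, so after the tweak the whole file ends up looking roughly like this (going from memory, so treat it as a sketch and check the option names against your Logstash version):
Code:
input {
  tcp {
    port => 5000
  }
}

## Add your filters here
filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    # service name from docker-compose; older Logstash versions use "host" instead of "hosts"
    hosts => "elasticsearch:9200"
  }
}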
Then you can go ahead and run "docker-compose up"; it'll download a few hundred MB of container images and then fire up the services. Keep in mind that you really don't want any of this exposed to the public internet, so make sure it all stays behind your firewall.

Step 2: Dump data from your weather station
Here's a Python script that scrapes data from my WS-1400-IP's livedata.htm page and turns it into JSON that I can dump into Logstash on port 5000. If you have a different type of weather station, this is the only thing you'd need to modify to dump its values out as JSON.
Code:
#!/usr/bin/python

from lxml import html
from decimal import Decimal, InvalidOperation
import requests
import simplejson as json  # simplejson serializes Decimal values natively

ws_ip = '192.168.1.36'  # LAN address of the WS-1400-IP
ws_id = 'ws1'           # label used as the top-level key in the JSON output

# The station reports every reading as a named <input> on livedata.htm,
# so fetch the page and grab all of those elements.
page = requests.get('http://' + ws_ip + '/livedata.htm')
tree = html.fromstring(page.content)
data = tree.xpath('//input[@name]')

def num_or_str(s):
    # Convert numeric readings to Decimal; leave everything else as a string.
    try:
        return Decimal(s)
    except (ValueError, InvalidOperation):
        return s

ws = {ws_id: {}}
for node in data:
    # print("%s %s" % (node.name, node.value))
    ws[ws_id][node.name] = num_or_str(node.value)

print(json.dumps(ws, sort_keys=True))
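
For reference, the script spits out one JSON document keyed by the station ID, along these lines (values made up, and the exact field names depend on what your firmware's livedata.htm exposes):
Code:
{
  "ws1": {
    "avgwind": 4.3,
    "windir": 225,
    ... one entry per named input on livedata.htm ...
  }
}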

Then just run the script from a cron job or a simple loop and netcat its output to Logstash on port 5000, like:
Code:
while true; do
  python ./ws_parse.py > output.json && nc localhost 5000 < output.json
  sleep 600  # poll every 10 minutes
done
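
Or, if you'd rather let cron do the scheduling instead of a while loop, a crontab entry along these lines works too (adjust the path to wherever you put the script):
Code:
# run every 10 minutes and pipe the JSON straight to Logstash
*/10 * * * * /usr/bin/python /path/to/ws_parse.py | nc localhost 5000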

Step 3: Configure Kibana 4
Go to Kibana at http://localhost:5601/. It's not the most intuitive interface, but you should be able to poke around and figure it out.
* The Discover pane should list the fields available in Elasticsearch on the left, and you can add fields to the main table view.  Use the Time filter on the top right to select enough data points to make things interesting (if nothing shows up at all, see the curl check after this list to confirm documents are actually reaching Elasticsearch).
* The Visualize pane lets you create line graphs, bar charts, etc.  For most things, you'll want to start by setting the "Bucket | X-Axis | Aggregation" to "Date Histogram" on the timestamp field, then expand the Y-Axis, set its "Aggregation" to "Average", and pick one of the numeric fields available.  Then hit the green arrow to graph it.  Once you're happy with the graph, use the Save button to give that visualization a name.
* The Dashboard pane then lets you assemble and arrange several saved visualizations; you can save those layouts and come back to them later.
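
If Discover comes up empty, it's worth checking whether documents are landing in Elasticsearch at all before blaming Kibana. Something like this against the default logstash-* indices (adjust the index pattern if you changed it) should return the most recent document:
Code:
curl 'http://localhost:9200/logstash-*/_search?size=1&sort=@timestamp:desc&pretty'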

Visualization survey
So Kibana wasn't really purpose-built to show weather stuff, but it would be neat to see what interesting displays other people can come up with!

For example, Kibana doesn't seem to have a "radial graph" that would be good for plotting wind direction, so I abused the Pie Chart to create a "Prevailing winds" view as follows:
* Filtered for "NOT ws1.avgwind:0", which strips out all the data points where the wind wasn't actually blowing.
* Binned ws1.windir in 90-degree increments, so the pie chart shows the rough percentage of time the wind was blowing from the NE, SE, SW, and NW quadrants (the underlying aggregation is sketched below).
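
Under the hood that binning is just an Elasticsearch range aggregation on the wind direction field; the quadrant buckets I set up in the Visualize pane boil down to something like this (field name matches my station's data, so adjust it for yours):
Code:
"aggs": {
  "wind_quadrant": {
    "range": {
      "field": "ws1.windir",
      "ranges": [
        { "key": "NE", "from": 0,   "to": 90  },
        { "key": "SE", "from": 90,  "to": 180 },
        { "key": "SW", "from": 180, "to": 270 },
        { "key": "NW", "from": 270, "to": 360 }
      ]
    }
  }
}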

Future:
Just wanted to get this out here, in case anyone else is interested in playing with this approach.  There's still plenty of work to do:
* Add a read-only reverse proxy in front of Kibana, so we can more safely expose the Kibana dashboard to the public internet, plus an HTTPS login page for full remote access.  It's just a matter of configuring an nginx container and including it in the docker-compose setup (a rough sketch is below this list).
* Add lat/lon data in the parser (or somewhere), so we can use Kibana's map display.  That'd be pretty boring for the single fixed station most of us have, but hey, maybe someday you'll be logging several stations on boats or something ;)
* Package the parser script in a docker container, which will get rid of Step 2.  And maybe add some maintenance scripts to compress/archive old data before it eats up your disk, or to migrate archives from an older ELK to an updated set of ELK containers.
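
For the reverse-proxy item above, the nginx side is pretty simple; a rough sketch (server name, cert paths, and the htpasswd file are placeholders, and "kibana" assumes the docker-compose service name) might look like:
Code:
server {
  listen 443 ssl;
  server_name weather.example.com;

  ssl_certificate     /etc/nginx/certs/weather.crt;
  ssl_certificate_key /etc/nginx/certs/weather.key;

  # basic-auth login in front of the dashboard
  auth_basic           "Kibana";
  auth_basic_user_file /etc/nginx/htpasswd;

  location / {
    # Kibana 4 still needs POSTs for its own queries, so a truly
    # read-only proxy would take more filtering than this
    proxy_pass http://kibana:5601;
  }
}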

Offline Bushman
Re: Kibana dashboard for WS
« Reply #1 on: December 07, 2015, 06:02:47 PM »
Cool.  But there are similar free services.

 
