---
title: Precision Temperature Measurement
date: 2013-06-10 22:25:27
tags: microcontroller, old
---

# Precision Temperature Measurement

__In an effort to resume previous work \[[A](http://www.swharden.com/blog/2010-11-24-atmega48-lm335-max232-serial-port-multi-channel-temperature-measurement/), [B](http://www.swharden.com/blog/2010-11-28-crystal-oven-experiments/), [C](http://www.swharden.com/blog/2010-08-27-hacking-together-a-crystal-oven-part-2/), [D](http://www.swharden.com/blog/2010-08-26-minimalist-crystal-oven/)\] on developing a crystal oven for radio frequency transmitter / receiver stabilization purposes, the first step for me was to create a device to accurately measure and log temperature.__ I did this with common, cheap components, and the output is saved to the computer (over 1,000 readings a second). Briefly, I use an [LM335 precision temperature sensor](http://www.ti.com/lit/ds/symlink/lm335.pdf) ([$0.70 on Mouser](http://www.mouser.com/ProductDetail/STMicroelectronics/LM335Z/?qs=sGAEpiMZZMusbZ2pNxAMx3IjjBanxLGdnwZerf04Dlo%3d)) which outputs a voltage proportional to temperature. It acts like a [Zener diode](http://en.wikipedia.org/wiki/Zener_diode) whose breakdown voltage relates to temperature: 2.95V is 295K (Kelvin), which is 22ºC / 71ºF. Note that Kelvin is just ºC + 273.15 (the difference between freezing and [absolute zero](http://en.wikipedia.org/wiki/Absolute_zero)). My goal was to use the [ADC](http://en.wikipedia.org/wiki/Analog_digital_converter) of a microcontroller to measure the output. The problem is that my [ADC](http://en.wikipedia.org/wiki/Analog_digital_converter) (one of 6 channels built into the [ATMEL ATMega8 microcontroller](http://www.atmel.com/Images/Atmel-2486-8-bit-AVR-microcontroller-ATmega8_L_datasheet.pdf)) has 10-bit resolution, reporting 0-5V as values from 0-1023. Thus, each step represents 0.0049V (0.49ºC / 0.88ºF). While ~1ºF resolution might be acceptable for _some_ temperature measurement or control applications, I want to see fractions of a degree because temperature stabilization of radio frequency crystals is critical. Here's a video overview.
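
To put numbers on that resolution limit, here's a little Python sketch using the values above (5V reference, 10-bit ADC, and the LM335's 10mV-per-Kelvin output). It's just the arithmetic, not part of the project code:

```python
# Resolution of a bare 10-bit ADC reading an LM335 (5 V reference assumed).
V_REF = 5.0             # ADC reference voltage
ADC_STEPS = 1024        # 10-bit converter: codes 0-1023
LM335_V_PER_K = 0.010   # LM335 outputs 10 mV per Kelvin

def adc_to_fahrenheit(adc_value):
    """Convert a raw 10-bit ADC count to degrees Fahrenheit."""
    volts = adc_value * V_REF / ADC_STEPS
    kelvin = volts / LM335_V_PER_K
    celsius = kelvin - 273.15
    return celsius * 9.0 / 5.0 + 32.0

step_f = adc_to_fahrenheit(1) - adc_to_fahrenheit(0)
print("one ADC step is about %.2f F" % step_f)                  # ~0.88 F
print("2.95 V reads as about %.1f F" % adc_to_fahrenheit(604))  # ~71 F
```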

![](https://www.youtube.com/embed/LTPncC2e3Zo)

__This is the circuit I came up with.__ My goal was to make it cheaply, with what I had on hand. It could certainly be better (more stable, more precise, etc.) but this seems to be working nicely. The idea is that you set the gain (the ratio of R2/R1) to get your desired resolution (so the 5V span of the ADC covers just the several ºF you're interested in), then set a "base offset" temperature that will produce 0V. In my design, I adjusted it so 0V was room temperature and 5V (the maximum) was body temperature. This way, when I touched the sensor I'd watch the temperature rise, and when I let go I'd watch it fall. Component values are very non-critical. The LM324 is powered from a single supply (GND at 0V, Vcc at +5V); I chose to keep things simple and use a single-rail power supply. It is worth noting that I ended up using a 3.5V Zener diode for the positive end of the potentiometer rather than 5V. If your power supply is well regulated, 5V will be no problem, but as I was powering this from USB I decided to go for some extra stability by using a Zener reference.
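
To illustrate the scaling idea (gain from R2/R1, plus a base offset that maps to 0V), here's a rough Python sketch. The resistor values and the 2.95V offset are made-up examples for illustration, not the actual components in my circuit:

```python
# Rough model of the scaling stage (illustrative values only, not the real circuit).
GAIN = 33000.0 / 1000.0   # R2/R1, hypothetical resistor values
V_OFFSET = 2.95           # "base offset" set by the potentiometer (~room temperature)

def amplified(v_lm335):
    """Map the LM335 voltage onto the 0-5 V ADC range, clipping at the supply rails."""
    v_out = (v_lm335 - V_OFFSET) * GAIN
    return min(max(v_out, 0.0), 5.0)

for v in (2.95, 2.98, 3.10):   # room temperature, slightly warm, body temperature
    print("LM335 at %.2f V -> amplifier output %.2f V" % (v, amplified(v)))
```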

<div class="text-center img-border">

[![](precision-thermometer-LM335-LM324-microcontroller_thumb.jpg)](precision-thermometer-LM335-LM324-microcontroller.jpg)

</div>

__On the microcontroller side, analog-to-digital measurement is summed up pretty well in the datasheet.__ There is a lot of good documentation on the internet about how to get reliable, stable measurements: decoupling capacitors, reference voltages, etc. That's outside the scope of today's topic. In my case, the output of the amplifier went into the ATMega8's ADC5 input (PC5, pin 28). Decoupling capacitors were placed at AREF and AVCC, according to the datasheet. Microcontroller code is at the bottom of this post.

<div class="text-center">

![](photo-3.jpg)

</div>

__To get the values to the computer, I used the USART capability of my microcontroller and sent ADC readings (at a rate over 1,000 a second) over a USB adapter based on an FTDI FT232 chip.__ I got eBay knock-off FTDI evaluation boards which come with a USB cable too (they're about $6, free shipping). Yeah, I could have done it cheaper, but this works effortlessly. I don't use a crystal. I set [fuse settings](http://www.engbedded.com/fusecalc) so the MCU runs at 8MHz, and thanks to the [nifty online baud rate calculator](http://www.wormfood.net/avrbaudcalc.php) I determined I can use a variety of transfer speeds (up to 38400 baud). At 1MHz (if the DIV8 fuse bit is enabled) I'm limited to 4800 baud. Here's the result: me touching the sensor with my finger (heating it), then letting go.
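
For reference, here's the arithmetic behind the firmware's BAUD_PRESCALE macro (assuming the 8MHz clock and 38400 baud used here):

```python
# UBRR value for the AVR USART in normal (16x) mode, matching BAUD_PRESCALE below.
F_CPU = 8000000   # 8 MHz internal clock (set by fuse settings, no crystal)
BAUD = 38400

ubrr = F_CPU // (16 * BAUD) - 1
actual = F_CPU / (16.0 * (ubrr + 1))
print("UBRR=%d, actual baud=%.0f (%.2f%% error)" % (ubrr, actual, 100.0 * (actual - BAUD) / BAUD))
# UBRR=12, actual baud=38462 (0.16% error) -- well within what the USART tolerates
```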

<div class="text-center">

[![](finger-touch_thumb.jpg)](finger-touch.png)

</div>

Touching the temperature sensor with my finger, the voltage rose exponentially. When I removed my finger, it decayed exponentially, like a thermal RC circuit with the capacitance being the heat capacity of the sensor itself. Small amounts of jitter are expected because I'm powering the MCU from unregulated USB +5V.

__I spent a while considering fancy ways to send the data__ (checksums, frame headers, error correction, etc.) but ended up just sending it as old-fashioned ASCII characters. I used to care more about speed, but even sending ASCII it can push over a thousand ADC readings a second, which is plenty for me. I ended up throttling the output down to 10/second because it was just too much to log comfortably for long recordings (like 24 hours). In retrospect, it would have made sense to catch all those numbers and do the averaging on the PC side.
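
If you do want to average on the PC side, it's only a couple of lines with numpy. This is just a sketch assuming the full-rate readings are already sitting in a list or array:

```python
import numpy

def block_average(readings, factor=100):
    """Average every `factor` consecutive readings (e.g. ~1000/s down to ~10/s)."""
    readings = numpy.asarray(readings, dtype=float)
    readings = readings[:len(readings) // factor * factor]  # drop any ragged tail
    return readings.reshape(-1, factor).mean(axis=1)
```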

<div class="text-center">

[![](ac2_thumb.jpg)](ac2.png)

</div>

I keep my house around 70ºF at night when I'm there, and you can see the air conditioning kick on and off. In the morning the AC was turned off for the day and the temperature rose; when I got back home I turned the AC back on and it started to drop again.

__On the receive side, I have a nifty Python script with [PySerial](http://pyserial.sourceforge.net/) ready to catch data coming from the microcontroller.__ It's decoded, turned into values, and every 1,000 readings it [saves a numpy array as an NPY binary file](http://docs.scipy.org/doc/numpy/reference/generated/numpy.save.html). I run the project out of my Google Drive folder, so while I'm at work I can run the plotting program and it loads the NPY file and shows it. Today it allowed me to realize that my roommate turned off the air conditioning after I left, because I saw the temperature rising mid-day. The above graph is the temperature in my house for the last ~24 hours. That's about it! Here's some of the technical stuff.

AVR ATMega8 microcontroller code:

```c
#define F_CPU 8000000UL
#include <avr/io.h>
#include <util/delay.h>
#include <avr/interrupt.h>

/*
8MHZ: 300,600,1200,2400,4800,9600,14400,19200,38400
1MHZ: 300,600,1200,2400,4800
*/
#define USART_BAUDRATE 38400
#define BAUD_PRESCALE (((F_CPU / (USART_BAUDRATE * 16UL))) - 1)

/*
ISR(ADC_vect)
{
    PORTD^=255;
}
*/

void USART_Init(void){
    UBRRL = BAUD_PRESCALE;
    UBRRH = (BAUD_PRESCALE >> 8);
    UCSRB = (1<<TXEN);
    UCSRC = (1<<URSEL)|(1<<UCSZ1)|(1<<UCSZ0); // 8N1: 8 data bits, no parity, 1 stop bit
}

void USART_Transmit( unsigned char data ){
    while ( !( UCSRA & (1<<UDRE)) );
    UDR = data;
}

void sendNum(long unsigned int byte){
    if (byte==0){
        USART_Transmit(48);
    }
    while (byte){ // emit digits least-significant first (the PC un-reverses them)
        USART_Transmit(byte%10+48);
        byte-=byte%10;
        byte/=10;
    }

}

unsigned int readADC(char adcn){
    ADMUX = 0b0100000+adcn; // ADLAR=1 (left-adjust result) + channel adcn
    ADCSRA |= (1<<ADSC); // start a conversion
    while (ADCSRA & (1<<ADSC)) {}; // wait for measurement
    return ADC>>6; // shift the left-adjusted 10-bit result into the low bits
}

void ADC_Init(){
    // ADC Enable, Prescaler 128
    ADCSRA = (1<<ADEN)  | 0b111;
}

int main(void){
    //DDRD=255;
    USART_Init();
    ADC_Init();
    for(;;){
        sendNum(readADC(5));
        USART_Transmit('\n'); // newline terminator so the PC can use readline()
        _delay_ms(100);
    }
}
```

Here is the Python code to receive the data and log it to disk:

```python
import serial, time
import numpy
ser = serial.Serial("COM15", 38400, timeout=100)

line=ser.readline()[:-1]
t1=time.time()
lines=0

data=[]

while True:
    line=ser.readline()[:-1]

    if "," in line:
        line=line.split(",")
        for i in range(len(line)):
            line[i]=line[i][::-1]
    else:
        line=[line[::-1]]
    temp=int(line[0])
    lines+=1
    data.append(temp)
    print "#",
    if lines%1000==999:
        numpy.save("DATA.npy",data)
        print
        print line
        print "%d lines in %.02f sec (%.02f vals/sec)"%(lines,
                time.time()-t1,lines/(time.time()-t1))
```

Here is the Python code to plot the data that has been saved:

```python
import numpy
import pylab

data=numpy.load("DATA.npy")
print data
data=data*.008 #convert to F
xs=numpy.arange(len(data))/9.95  #vals/sec
xs=xs/60.0# minutes
xs=xs/60.0# hours

pylab.plot(xs,data)
pylab.grid(alpha=.5)
pylab.axis([None,None,0*.008,1024*.008])
pylab.ylabel(r'$\Delta$ Fahrenheit')
pylab.xlabel("hours")
pylab.show()
```
---
title: Realtime image pixelmap from Numpy array data in Qt
date: 2013-06-03 22:40:56
tags: python, old
---

# Realtime image pixelmap from Numpy array data in Qt

__Consider realtime spectrograph software like [QRSS VD](http://www.swharden.com/blog/qrss_vd/#screenshots). Its primary function is to scroll a potentially huge data-rich image across the screen. In Python, this is often easier said than done.__ If you're not careful, you can tackle this problem inefficiently and get terrible frame rates (<5FPS) or eat a huge amount of system resources (I often get complaints that QRSS VD takes up a lot of processor resources, and 99% of it is drawing the images). In the past, I've done it at least 4 different ways ([one](http://www.swharden.com/blog/2010-03-05-animated-realtime-spectrograph-with-scrolling-waterfall-display-in-python/), [two](http://www.swharden.com/blog/2013-05-09-realtime-fft-audio-visualization-with-python/), [three](http://www.swharden.com/blog/qrss_vd/#screenshots), [four](http://www.swharden.com/blog/2010-06-24-fast-tk-pixelmap-generation-from-2d-numpy-arrays-in-python/), [five](http://www.swharden.com/blog/2010-03-05-realtime-fft-graph-of-audio-wav-file-or-microphone-input-with-python-scipy-and-wckgraph/)). Note that "four" seems to be the absolute fastest option so far. I've been keeping an eye out for a while now, contemplating the best way to rapidly draw color-mapped 8-bit data in a Python program. Now that I'm doing a majority of my graphical development with PyQt and QtDesigner (packaged with [PythonXY](https://code.google.com/p/pythonxy/)), I ended up with a solution that looks like this (plotting random data with a colormap):


<div class="text-center img-border">

![](qt-scrolling-spectrograph.gif)

</div>

1.) in QtDesigner, create a form with a **scrollAreaWidget**

2.) in QtDesigner, add a **label** inside the **scrollAreaWidget**

3.) in code, resize **label** and also **scrollAreaWidgetContents** to fit data (disable "widgetResizable")

4.) in code, create a **QImage** from a 2D numpy array (dtype=uint8)

5.) in code, set **label** pixmap to QtGui.QPixmap.fromImage(**QImage**)

That's pretty much it! Here are some highlights of my program. Note that the code for the GUI is in a separate file, and must be downloaded from the ZIP provided at the bottom. Hope it helps someone else out there who might want to do something similar!

```python
import ui_main
import sys
from PyQt4 import QtCore, QtGui

import sys
from PyQt4 import Qt
import PyQt4.Qwt5 as Qwt
from PIL import Image
import numpy
import time

spectroWidth=1000
spectroHeight=1000

a=numpy.random.random(spectroHeight*spectroWidth)*255
a=numpy.reshape(a,(spectroHeight,spectroWidth))
a=numpy.require(a, numpy.uint8, 'C')

COLORTABLE=[]
for i in range(256): COLORTABLE.append(QtGui.qRgb(i/4,i,i/2))

def updateData():
    global a
    a=numpy.roll(a,-5)
    QI=QtGui.QImage(a.data, spectroWidth, spectroHeight, QtGui.QImage.Format_Indexed8)
    QI.setColorTable(COLORTABLE)
    uimain.label.setPixmap(QtGui.QPixmap.fromImage(QI))

if __name__ == "__main__":
    app = QtGui.QApplication(sys.argv)
    win_main = ui_main.QtGui.QWidget()
    uimain = ui_main.Ui_win_main()
    uimain.setupUi(win_main)

    # SET UP IMAGE
    uimain.IM = QtGui.QImage(spectroWidth, spectroHeight, QtGui.QImage.Format_Indexed8)
    uimain.label.setGeometry(QtCore.QRect(0,0,spectroWidth,spectroHeight))
    uimain.scrollAreaWidgetContents.setGeometry(QtCore.QRect(0,0,spectroWidth,spectroHeight))

    # SET UP RECURRING EVENTS
    uimain.timer = QtCore.QTimer()
    uimain.timer.start(.1)
    win_main.connect(uimain.timer, QtCore.SIGNAL('timeout()'), updateData)

    ### DISPLAY WINDOWS
    win_main.show()
    sys.exit(app.exec_())
```
---
title: Wireless Microcontroller / PC Interface for $3.21
date: 2013-05-19 01:32:46
tags: microcontroller, old, python
---

# Wireless Microcontroller / PC Interface for $3.21

__Here I demonstrate a dirt-cheap method of transmitting data from any microchip to any PC using $3.21 in parts.__ I've had this idea for a while, but finally got it working tonight. On the transmit side, I'm having an ATMEL AVR microcontroller (ATMega48) transmit data (every number from 0 to 200, over and over) wirelessly using 433MHz wireless modules. The PC receives the data through the microphone port of a sound card, and a cross-platform Python script I wrote decodes the data from the audio and graphs it on the screen. I [did something similar back in 2011](http://www.swharden.com/blog/2011-07-09-sound-card-microcontrollerpc-communication/), but it wasn't wireless, and the software wasn't nearly as robust as it is now.

__This is a proof-of-concept demonstration, and part of a larger project.__ I think there's a need for this type of thing though! It's unnecessarily hard to transfer data from an MCU to a PC as it is. There's USB (for AVR, [V-USB](http://www.obdev.at/products/vusb/index.html) is a nightmare and requires a precise, specific clock speed; DIP chips don't have native USB, and some PIC DIP chips do, but then you have to go through driver hell), [USART RS-232 over serial port](http://www.swharden.com/blog/2009-05-14-simple-case-avrpc-serial-communication-via-max232/) (but who has serial ports these days?), or USART over USB RS-232 interface chips (like the [FTDI FT-232](http://www.ftdichip.com/Products/ICs/FT232R.htm), but surface mount only), and the latter two also require precise, specific clock speeds. Pretend I want to just measure temperature once a minute. Do I _really_ want to etch circuit boards and solder SMT components? Well, kinda, but I don't like feeling forced to. Sometimes you just want a no-nonsense way to get some numbers from your microchip to your computer. This project is a funky out-of-the-box alternative to traditional methods, and one that I hope will raise a few eyebrows.

<div class="text-center img-border">

[![](c31_thumb.jpg)](c31.jpg)

</div>

__Ultimately, I designed this project to eventually allow multiple "bursting" data transmitters to transmit on the same frequency routinely__, thanks to syncing and forced-sync-loss (read on). It's part of what I'm tongue-in-cheek calling the _Scott Harden RF Protocol_ (SH-RFP). In my goal application, I wish to have about 5 wireless temperature sensors all transmitting data to my PC. The receive side has some error checking in that it makes sure pulse sizes are sensible and symmetrical (unlike random noise), and since each number is sent twice (the second time with the bit polarity reversed), there's another layer of error detection. This is \*NOT\* a robust and accurate method to send critical data. It's a cheap way to send data. It is very range limited, and is only intended to work over a distance of ten or twenty feet. First, let's see it in action!

![](https://www.youtube.com/embed/GJHFldPwZvM)

__The RF modules are pretty simple. [At $1.56 on eBay](http://www.ebay.com/itm/KDQ11-NEW-1PCS-433MHZ-RF-TRANSMITTER-AND-RECEIVER-LINK-KIT-FOR-ARDUINO-SCA-1710-/350797631746?pt=LH_DefaultDomain_0&hash=item51ad2b1102) (with free shipping), they're cheap too!__ I won't go into detail documenting the ins and outs of these things (that's done well elsewhere). Briefly, you give them +5V (VCC) and 0V (GND), and flip their data pin (ATAD) on and off on the transmitter module, and the receiver module's DATA pin reflects the same state. The receiver uses a gain circuit which continuously increases gain until a signal is detected, so if you're not transmitting it WILL decode noise and start flipping its output pin. Note that persistent high or low states are prone to noise too, so any protocol you use these things for should have rapid state transitions. It's also suggested that you maintain an average 50% duty cycle. These modules utilize [amplitude shift keying](http://en.wikipedia.org/wiki/Amplitude-shift_keying) (ASK) to transmit data wirelessly. The graphic below shows what that looks like at the RF level. Transmit and receive range is improved by adding a quarter-wavelength vertical antenna to the "ANT" solder pad. At 433MHz, that is about 17cm, so I'm using a 17cm copper wire as an antenna.
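
As a quick sanity check on that antenna length, here's the free-space quarter-wave math (ignoring velocity factor):

```python
# Quarter-wave whip length at 433 MHz (free-space approximation).
C = 299792458.0   # speed of light, m/s
FREQ = 433e6      # module frequency, Hz

quarter_wave_cm = C / FREQ / 4 * 100
print("quarter wavelength at 433 MHz: %.1f cm" % quarter_wave_cm)  # ~17.3 cm
```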

__Transmitting from the microcontroller is easy as pie!__ It’s just a matter of copying in a few lines of C. It doesn’t rely on USART, SPI, I2C, or any other protocol. Part of why I developed this method is that I often use the ATTiny44A, which doesn’t have a USART for serial interfacing. The “SH-RFP” is easy to implement just by adding a few lines of code. I can handle that. How does it work? I can define it simply by a few rules (a rough encoder sketch follows the packet description below):

*   Pulses can be one of 3 lengths: A (0), B (1), or C (break).
*   Each pulse represents high, then low of that length.

To send a packet:

*   prime synchronization by sending ten ABCs
*   indicate we’re starting data by sending C.
*   for each number you want to send:
  *   send your number bit by bit (A=0, B=1)
  *   send your number bit by bit (A=1, B=0)
  *   indicate number end by sending C.
*   tell PC to release the signal by sending ten Cs.
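
Here's a rough Python sketch of that packet structure, just to make the rules concrete. The 8-bit framing and the letter notation are my own shorthand; the real implementation is in the firmware linked at the bottom of this post:

```python
# Hypothetical SH-RFP packet builder following the rules listed above.
# 'A', 'B', and 'C' stand for the three pulse lengths (0, 1, and break).

def encode_number(value, bits=8):
    """One number: its bits as A=0/B=1, the same bits as A=1/B=0, then a break."""
    normal   = ["B" if (value >> i) & 1 else "A" for i in reversed(range(bits))]
    inverted = ["A" if (value >> i) & 1 else "B" for i in reversed(range(bits))]
    return normal + inverted + ["C"]

def encode_packet(numbers):
    pulses = list("ABC") * 10    # prime synchronization with ten ABCs
    pulses += ["C"]              # indicate that data is starting
    for n in numbers:
        pulses += encode_number(n)
    pulses += ["C"] * 10         # ten breaks tell the PC to release the signal
    return pulses

print("".join(encode_packet([200])))
```

On the microcontroller, each letter just becomes a high-then-low pulse of the corresponding length.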

Decoding is the same thing in reverse. I use an [eBay sound card at $1.29](search.ebay.com/usb-sound-card) (with free shipping) to get the signal into the PC. Synchronization is required to allow the PC to know that real data (not noise) is starting. Sending the same number twice (once with reversed bit polarity) is a proof-checking mechanism that lets us throw out data that isn’t accurate.

__From a software side,__ I’m using PyAudio to collect data from the sound card, and the PythonXY distribution to handle analysis with numpy and scipy, plotting with QwtPlot, and general GUI functionality with PyQt. I think that’s about everything.

<div class="text-center img-border">

[![](SHRFP_thumb.jpg)](SHRFP.png)

</div>

__The demonstration interface is pretty self-explanatory.__ The top-right shows a sample piece of data. The top-left is a histogram of the number of samples of each pulse width. A clean signal should show 3 peaks (A=0, B=1, C=break). Note that you’re supposed to look at the peaks to determine the best lengths to tell the software to use to distinguish A, B, and C. This was intentionally not hard-coded, because I want to rapidly switch from one microcontroller platform to another which may be operating at a different clock speed, and if all of a sudden it’s running 3 times slower it will be no problem to adjust on the PC side. Slick, huh? The bottom-left shows data values coming in. The bottom-right graphs those values. Rate reporting lets us know that I'm receiving over 700 good data points a second. That's pretty cool, especially considering I'm recording at 44,100 Hz.
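
The histogram step itself is simple once you have a list of measured pulse widths (in audio samples); here's a minimal numpy sketch, separate from the GUI linked below:

```python
import numpy

def pulse_width_histogram(pulse_widths, max_width=200):
    """Count pulses at each width (in samples); a clean signal shows 3 peaks (A, B, C)."""
    widths = numpy.asarray(pulse_widths, dtype=int)
    return numpy.bincount(widths, minlength=max_width)
```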

All source code (C files for an ATMega48 and Python scripts for the GUI) can be viewed here: [SHRFP project on GitHub](https://github.com/swharden/AVR-projects/tree/master/ATMega48%202013-05-14%20SHRFP%20monitor)

If you use these concepts, hardware, or ideas in your project, let me know about it! Send me an email showing me your project – I’d love to see it. Good luck!

---
title: Realtime FFT Audio Visualization with Python
date: 2013-05-09 19:52:02
tags: python, old
---

# Realtime FFT Audio Visualization with Python

>  WARNING: this project is largely outdated, and some of the modules are no longer supported by modern distributions of Python. For a more modern, cleaner, and more complete GUI-based viewer of realtime audio data (and the FFT frequency data), check out my [Python Real-time Audio Frequency Monitor](https://www.swharden.com/wp/2016-07-31-real-time-audio-monitor-with-pyqt/) project.

__I'm no stranger to visualizing linear data in the frequency-domain.__ From the high definition spectrograph suite I wrote in my first year of dental school ([QRSS-VD](http://www.swharden.com/blog/qrss_vd/), which differentiates tones to sub-Hz resolution), to the various scripts over the years (which go into [FFT imaginary number theory](http://www.swharden.com/blog/2010-06-23-insights-into-ffts-imaginary-numbers-and-accurate-spectrographs/), linear data [signal filtering with python](http://www.swharden.com/blog/2009-01-21-signal-filtering-with-python/), and real time [audio graphing with wckgraph](http://www.swharden.com/blog/2010-03-05-realtime-fft-graph-of-audio-wav-file-or-microphone-input-with-python-scipy-and-wckgraph/)), I've tried dozens of combinations of techniques to capture data, analyze it, and display it with Python. Because I'm now branching into making microcontroller devices which measure and transfer analog data to a computer, I need a way to rapidly visualize data obtained in Python. Since my microcontroller device isn't up and running yet, linear data from a PC microphone will have to do. Here's a quick and dirty start-to-finish project anyone can tease apart to figure out how to do some of these not-so-intuitive processes in Python. To my knowledge, this is a cross-platform solution too. For the sound card interaction, it relies on the cross-platform sound card interface library [PyAudio](http://people.csail.mit.edu/hubert/pyaudio/). My python distro is 2.7 ([python xy](https://code.google.com/p/pythonxy/)), but pythonxy doesn't [yet] supply PyAudio.


<div class="text-center img-border">

[![](realtime-fft-spectrum-python-pyqwt-graph_thumb.jpg)](realtime-fft-spectrum-python-pyqwt-graph.png)

</div>

__The code behind it is a little jumbled, but it works.__ For recording, I wrote a class "SwhRecorder" which uses threading to continuously record audio and save it as a numpy array. When the class is loaded and started, your GUI can wait until it sees _newAudio_ become _True_, then it can grab _audio_ directly, or use fft() to pull the spectral component (which is what I do in the video). Note that my fft() relies on [numpy.fft.fft()](http://docs.scipy.org/doc/numpy/reference/generated/numpy.fft.fft.html). The return is a nearly-symmetrical mirror image of the frequency components, which (get ready to cringe, mathematicians) I simply split into two arrays, reverse one of them, and add together. To turn this absolute value into dB, I'd take the log10(fft) and multiply it by 20. You know, if you're into that kind of thing, you should really check out a [post I made about FFT theory and analyzing audio data in python](http://www.swharden.com/blog/2010-06-23-insights-into-ffts-imaginary-numbers-and-accurate-spectrographs/).
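
Here's what that folding and dB conversion looks like on its own, as a standalone numpy sketch with a synthetic 1kHz tone (separate from the SwhRecorder class below):

```python
import numpy

RATE = 48100                                    # sample rate used by SwhRecorder below
t = numpy.arange(4096) / float(RATE)
signal = numpy.sin(2 * numpy.pi * 1000 * t)     # synthetic 1 kHz tone

spectrum = numpy.abs(numpy.fft.fft(signal))
left, right = numpy.split(spectrum, 2)
folded = left + right[::-1]                     # fold the mirrored half back onto the first
db = 20 * numpy.log10(folded + 1e-12)           # tiny offset avoids log10(0)

freqs = numpy.arange(len(folded)) * RATE / float(len(signal))
print("peak at %.0f Hz" % freqs[folded.argmax()])   # ~1000 Hz
```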

![](https://www.youtube.com/embed/vQ1e47VXxZg)

__Here's the meat of the code.__ To run it, you should really grab the zip file at the bottom of the page. I'll start with the recorder class:

```python
import matplotlib
matplotlib.use('TkAgg') # THIS MAKES IT FAST!
import numpy
import scipy
import struct
import pyaudio
import threading
import pylab
import struct

class SwhRecorder:
    """Simple, cross-platform class to record from the microphone."""

    def __init__(self):
        """minimal garb is executed when class is loaded."""
        self.RATE=48100
        self.BUFFERSIZE=2**12 # 4096 samples per buffer
        self.secToRecord=.1
        self.threadsDieNow=False
        self.newAudio=False

    def setup(self):
        """initialize sound card."""
        #TODO - windows detection vs. alsa or something for linux
        #TODO - try/except for sound card selection/initiation

        self.buffersToRecord=int(self.RATE*self.secToRecord/self.BUFFERSIZE)
        if self.buffersToRecord==0: self.buffersToRecord=1
        self.samplesToRecord=int(self.BUFFERSIZE*self.buffersToRecord)
        self.chunksToRecord=int(self.samplesToRecord/self.BUFFERSIZE)
        self.secPerPoint=1.0/self.RATE

        self.p = pyaudio.PyAudio()
        self.inStream = self.p.open(format=pyaudio.paInt16,channels=1,
            rate=self.RATE,input=True,frames_per_buffer=self.BUFFERSIZE)
        self.xsBuffer=numpy.arange(self.BUFFERSIZE)*self.secPerPoint
        self.xs=numpy.arange(self.chunksToRecord*self.BUFFERSIZE)*self.secPerPoint
        self.audio=numpy.empty((self.chunksToRecord*self.BUFFERSIZE),dtype=numpy.int16)

    def close(self):
        """cleanly back out and release sound card."""
        self.p.close(self.inStream)

    ### RECORDING AUDIO ###

    def getAudio(self):
        """get a single buffer size worth of audio."""
        audioString=self.inStream.read(self.BUFFERSIZE)
        return numpy.fromstring(audioString,dtype=numpy.int16)

    def record(self,forever=True):
        """record secToRecord seconds of audio."""
        while True:
            if self.threadsDieNow: break
            for i in range(self.chunksToRecord):
                self.audio[i*self.BUFFERSIZE:(i+1)*self.BUFFERSIZE]=self.getAudio()
            self.newAudio=True
            if forever==False: break

    def continuousStart(self):
        """CALL THIS to start running forever."""
        self.t = threading.Thread(target=self.record)
        self.t.start()

    def continuousEnd(self):
        """shut down continuous recording."""
        self.threadsDieNow=True

    ### MATH ###

    def downsample(self,data,mult):
        """Given 1D data, return the binned average."""
        overhang=len(data)%mult
        if overhang: data=data[:-overhang]
        data=numpy.reshape(data,(len(data)/mult,mult))
        data=numpy.average(data,1)
        return data

    def fft(self,data=None,trimBy=10,logScale=False,divBy=100):
        if data is None:
            data=self.audio.flatten()
        left,right=numpy.split(numpy.abs(numpy.fft.fft(data)),2)
        ys=numpy.add(left,right[::-1])
        if logScale:
            ys=numpy.multiply(20,numpy.log10(ys))
        xs=numpy.arange(self.BUFFERSIZE/2,dtype=float)
        if trimBy:
            i=int((self.BUFFERSIZE/2)/trimBy)
            ys=ys[:i]
            xs=xs[:i]*self.RATE/self.BUFFERSIZE
        if divBy:
            ys=ys/float(divBy)
        return xs,ys

    ### VISUALIZATION ###

    def plotAudio(self):
        """open a matplotlib popup window showing audio data."""
        pylab.plot(self.audio.flatten())
        pylab.show()
```
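
If you want to try the recorder class by itself without any Qt, a minimal console test could look something like this (assuming the class above is saved as recorder.py, as the GUI launcher below expects):

```python
import time
from recorder import SwhRecorder   # the class above, saved as recorder.py

SR = SwhRecorder()
SR.setup()
SR.continuousStart()

try:
    for _ in range(10):
        while not SR.newAudio:     # wait for the recording thread to fill a chunk
            time.sleep(0.01)
        xs, ys = SR.fft()
        print("strongest frequency: %.0f Hz" % xs[ys.argmax()])
        SR.newAudio = False
finally:
    SR.continuousEnd()
    SR.close()
```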

__And now here's the GUI launcher:__

```python
import ui_plot
import sys
import numpy
from PyQt4 import QtCore, QtGui
import PyQt4.Qwt5 as Qwt
from recorder import *

def plotSomething():
    if SR.newAudio==False:
        return
    xs,ys=SR.fft()
    c.setData(xs,ys)
    uiplot.qwtPlot.replot()
    SR.newAudio=False

if __name__ == "__main__":
    app = QtGui.QApplication(sys.argv)

    win_plot = ui_plot.QtGui.QMainWindow()
    uiplot = ui_plot.Ui_win_plot()
    uiplot.setupUi(win_plot)
    uiplot.btnA.clicked.connect(plotSomething)
    #uiplot.btnB.clicked.connect(lambda: uiplot.timer.setInterval(100.0))
    #uiplot.btnC.clicked.connect(lambda: uiplot.timer.setInterval(10.0))
    #uiplot.btnD.clicked.connect(lambda: uiplot.timer.setInterval(1.0))
    c=Qwt.QwtPlotCurve()
    c.attach(uiplot.qwtPlot)

    uiplot.qwtPlot.setAxisScale(uiplot.qwtPlot.yLeft, 0, 1000)

    uiplot.timer = QtCore.QTimer()
    uiplot.timer.start(1.0)

    win_plot.connect(uiplot.timer, QtCore.SIGNAL('timeout()'), plotSomething)

    SR=SwhRecorder()
    SR.setup()
    SR.continuousStart()

    ### DISPLAY WINDOWS
    win_plot.show()
    code=app.exec_()
    SR.close()
    sys.exit(code)
```

_Note that by commenting-out the FFT line and using "c.setData(SR.xs,SR.audio)" you can plot linear PCM data to visualize sound waves like this:_

<div class="text-center img-border">

[![](pcm_thumb.jpg)](pcm.png)

</div>

### Download Source Code

Finally, here’s the zip file. It contains everything you need to run the program on your own computer (including the UI scripts which are not shown on this page).

**DOWNLOAD:** [SWHRecorder.zip](SWHRecorder.zip)

_If you make a cool project based on this one, I'd love to hear about it. Good luck!_

---
title: Realtime Data Plotting in Python
date: 2013-05-08 16:34:27
tags: python, old
---

# Realtime Data Plotting in Python

>  WARNING: this project is largely outdated, and some of the modules are no longer supported by modern distributions of Python. For a more modern, cleaner, and more complete GUI-based viewer of realtime audio data (and the FFT frequency data), check out my [Python Real-time Audio Frequency Monitor](https://www.swharden.com/wp/2016-07-31-real-time-audio-monitor-with-pyqt/) project.

__I love using python for handling data. Displaying it isn't always as easy.__ Python is fast to write, and numpy, scipy, and matplotlib are an incredible combination. I love matplotlib for displaying data and [use it all the time](http://swharden.com/blog/matplotlib), but when it comes to realtime data visualization, matplotlib (admittedly) falls behind. Imagine trying to plot sound waves in real time. Matplotlib simply can't handle it. I've recently been making progress toward this end with PyQwt (from the [Python X,Y](https://code.google.com/p/pythonxy/) distribution). It is a cross-platform solution which should perform identically on Windows, Linux, and MacOS. Here's an example of what it looks like plotting some dummy data (a sine wave) being transformed with numpy.roll().

<div class="text-center img-border">

![](f.gif)

</div>

__How did I do it?__ Easy. First, I made the GUI with [QtDesigner](http://qt-project.org/doc/qt-4.8/designer-manual.html) (which comes with Python x,y). I saved the GUI as a .ui file. I then used the pyuic4 command to generate a Python script from the .ui file. In reality, I use a little helper script I wrote, designed to build .py files from .ui files and start a little "ui.py" file which imports all of the ui classes. It's overkill for this, but I'll put it in the ZIP anyway. Here's what the GUI looks like in QtDesigner:

<div class="text-center img-border">

[![](qtdesigner-python-windows-qwtplot_thumb.jpg)](qtdesigner-python-windows-qwtplot.png)

</div>
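
As a side note, if you'd rather skip the pyuic4 step entirely, PyQt4's uic module can load the .ui file directly at runtime. A tiny sketch (the .ui filename here is just a placeholder):

```python
# Skipping pyuic4: load the Designer .ui file directly at runtime with PyQt4's uic module.
import sys
from PyQt4 import QtGui, uic

app = QtGui.QApplication(sys.argv)
win = uic.loadUi("plot.ui")   # "plot.ui" is a placeholder for your saved Designer file
win.show()
sys.exit(app.exec_())
```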

__After that, I tie everything together in a little script which updates the plot in real time.__ It takes inputs from button click events and tells a clock (QTimer) how often to update/replot the data. Replotting it involves just rolling it with [numpy.roll()](http://docs.scipy.org/doc/numpy/reference/generated/numpy.roll.html).  Check it out:

```python
import ui_plot #this was generated by pyuic4 command
import sys
import numpy
from PyQt4 import QtCore, QtGui
import PyQt4.Qwt5 as Qwt

numPoints=1000
xs=numpy.arange(numPoints)
ys=numpy.sin(3.14159*xs*10/numPoints) #this is our data

def plotSomething():
    global ys
    ys=numpy.roll(ys,-1)
    c.setData(xs, ys)
    uiplot.qwtPlot.replot()

if __name__ == "__main__":
    app = QtGui.QApplication(sys.argv)
    win_plot = ui_plot.QtGui.QMainWindow()
    uiplot = ui_plot.Ui_win_plot()
    uiplot.setupUi(win_plot)

    # tell buttons what to do when clicked
    uiplot.btnA.clicked.connect(plotSomething)
    uiplot.btnB.clicked.connect(lambda: uiplot.timer.setInterval(100.0))
    uiplot.btnC.clicked.connect(lambda: uiplot.timer.setInterval(10.0))
    uiplot.btnD.clicked.connect(lambda: uiplot.timer.setInterval(1.0))

    # set up the QwtPlot (pay attention!)
    c=Qwt.QwtPlotCurve()  #make a curve
    c.attach(uiplot.qwtPlot) #attach it to the qwtPlot object
    uiplot.timer = QtCore.QTimer() #start a timer (to call replot events)
    uiplot.timer.start(100.0) #set the interval (in ms)
    win_plot.connect(uiplot.timer, QtCore.SIGNAL('timeout()'), plotSomething)

    # show the main window
    win_plot.show()
    sys.exit(app.exec_())
```