Sometimes it’s tempting to re-invent the wheel to make a device function exactly the way you want. I am revisiting the field of homemade electrophysiology equipment, and although I’ve already published a homemade electrocardiograph (ECG), I wish to revisit that project and make it much more elegant, while also planning for a pulse oximeter, an electroencephalograph (EEG), and an electrogastrogram (EGG). This project is divided into 3 major components: the low-noise microvoltage amplifier, an analog-to-digital converter with PC connectivity, and software to display and analyze the traces. My first challenge is to create that middle step: a device to read voltage (from 0-5V) and send the data to a computer.

This project demonstrates a simple solution to the frustrating problem of sending data from a microcontroller to a PC over USB. My solution is an FTDI serial-to-USB cable: I simply put header pins on my device, plug the cable onto them, and the microcontroller-to-computer link is done. This avoids the need for soldering surface-mount FTDI chips (which gets expensive if you put one in every project). FTDI cables are inexpensive (about $11 shipped on eBay), I’ve gotten a lot of mileage out of mine, and I know I will continue to use it for future projects. If you are interested in MCU/PC communication, consider one of these cables as a rapid prototyping tool. I’m certainly enjoying mine!

It is important to me that my design is minimalistic, inexpensive, works natively on Linux and Windows without installing special driver software, and can be visualized in real time using standard Python libraries, so the same code runs identically on every operating system with minimal computer-side configuration. I’d say I succeeded in this effort. While the project could use some small touches to polish it up, it’s already solid and has proven its usefulness and functionality.

This is my final device. It reads voltage on a single pin, sends the data to a computer through a USB connection, and custom software (written entirely in Python, designed to be a cross-platform solution) displays the signal in real time. Although it’s capable of recording and displaying 5 channels at the same time, it’s demonstrated here displaying only one. Let’s check out a video of it in action:

This 5-channel realtime USB analog sensor, coupled with custom cross-platform open-source software, will serve as the foundation for a slew of electrophysiological experiments, but can also be easily expanded to serve as an inexpensive multichannel digital oscilloscope. While more advanced solutions exist, this has the advantage of being minimally complex (consisting of a single microchip), inexpensive, and easy to build.

To the right is my working environment during the development of this project. You can see electronics, my computer, microchips, and coffee, but also an intriguingly odd array of immunological posters in the background. I spent a couple of weeks camping out in a molecular biology laboratory here at UF and got a lot of work done, part of which involved diving into electronics again. At the time this photo was taken, I hadn’t worked much at my home workstation. It’s a cool picture, so I’m holding onto it.

Below is a simplified description of the circuit schematic used in this project. Note that there are 6 ADC (analog-to-digital converter) inputs on the ATMega48 IC, but for whatever reason I ended up only hard-coding 5 into the software. Eventually I’ll go back and re-declare this project a 6-channel sensor, but since I don’t have six things to measure at the moment I’m fine keeping it the way it is. RST, SCK, MISO, and MOSI are used to program the microcontroller and do not need to be connected to anything for operation. The MAX232 was initially used as a level converter to allow the microcontroller to communicate with a PC via the serial port. However, shortly after this project was devised, an upgrade was made to allow it to connect via USB. Continue reading for details…

Below you can see the circuit breadboarded. The potentiometer (small blue box) simulated an analog input signal.

The lower board is my AVR programmer, and is connected to RST, SCK, MISO, MOSI, and GND to allow me to write code on my laptop and program the board. It’s a Fun4DIY.com AVR programmer which can be yours for $11 shipped! I’m not affiliated with their company, but I love that little board. It’s a clone of the AVR ISP MK-II.

As you can see, the USB AVR programmer I’m using is supported in Linux. I did all of my development in Ubuntu Linux, writing AVR-GCC (C) code in my favorite Linux code editor Geany, then loaded the code onto the chip with AVRDude.

I found a simple way to add USB functionality in a standard, reproducible way that doesn’t require soldering an SMT FTDI chip, and avoids firmware-only libraries like V-USB whose drivers aren’t supported by major operating systems (Windows) without special software. I understand that the simplest long-term, commercially-logical solution would be to use that SMT chip, but I didn’t feel like dealing with it. Instead, I added header pins which let me snap on a pre-made FTDI USB cable. They’re a bit expensive ($12 on eBay), but all I need is one, and I can use it in all my projects since it’s a cinch to connect and disconnect. Besides, it supplies power to the target board! It’s supported in Linux and in Windows with established drivers that ship with the operating system. It’s a bit of a shortcut, but I like this solution. It also eliminates the need for the MAX232 chip, since the cable senses the voltages output by the microcontroller directly.

The system works by individually reading the 10-bit ADC pins on the microcontroller (providing values from 0-1023 to represent voltage from 0-5V or 0-1.1V, depending on how the code is written), converting those values to text, and sending them as a string via the serial protocol. The FTDI cable reads these values and transmits them to the PC through a USB connection, which appears as a serial port (“COM5” on my Windows computer). Values can be seen in any serial terminal program (e.g., HyperTerminal) or accessed through Python with the PySerial module.
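On the PC side, grabbing and parsing one of those comma-separated lines takes just a few lines of PySerial. Here is a minimal sketch (the port name is an assumption; match it and the baud rate to your own setup and firmware):

# minimal PC-side reader sketch using PySerial; the port name, baud rate, and
# 5-values-per-line format are assumptions based on this project's firmware
import serial

ser = serial.Serial("COM5", 38400, timeout=1)   # e.g. "/dev/ttyUSB0" on Linux

while True:
    line = ser.readline().decode("ascii", "ignore").strip()  # e.g. "512,301,0,1023,87,"
    fields = [f for f in line.split(",") if f]
    if len(fields) < 5:
        continue                                # skip incomplete lines
    try:
        counts = [int(f) for f in fields[:5]]   # 10-bit ADC readings (0-1023)
    except ValueError:
        continue                                # line was garbled mid-transmission
    volts = [c * 5.0 / 1023 for c in counts]    # assuming the 5 V reference
    print(volts)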

As you can see, I’m getting quite good at home-brewed PCBs. While it would be fantastic to design a board and have it made professionally, that is expensive and takes time. In my case, I only have a few hours here or there to work on projects. If I have time to design a board, I want it made immediately! I can make this start to finish in about an hour. I use the classic toner-transfer method with ferric chloride, and a Dremel drill press to create the holes. I haven’t attempted single-layer SMT designs yet, but I can see their convenience, and I look forward to giving it a shot before too long.

Here’s the final board ready for digitally reporting analog voltages. You can see 3 small headers on the far left and 2 at the top of the chip. These are for RST, SCK, MISO, MOSI, and GND for programming the chip. Once it’s programmed, it doesn’t need to be programmed again. Although I wrote the code for an ATMega48, it works fine on a pin-compatible ATMega8 which is pictured here. The connector at the top is that FTDI USB cable, and it supplies power and USB serial connectivity to the board.

If you look closely, you can see that modified code has been loaded onto this board with a Linux laptop. This is an exciting little board, because it has so many possibilities. It could read voltages on a single channel at extremely high speed and send that data continuously, or it could read from many channels and send them at any rate, or (even cooler) bidirectional serial communication could be added to let the computer tell the microcontroller which channels to read and how often to report the values back. There is a lot of potential in this little design, and I’m glad I have it working.

Unfortunately I lost the schematics for this device because I formatted the computer that had the Eagle files on it. It should be simple and intuitive enough to design again. The code for the microcontroller and the code for the real-time visualization software are posted below. Below are some videos of this board in use in one form or another:

Here is the code that is loaded onto the microcontroller:

#define F_CPU 8000000UL
#include <avr/io.h>
#include <util/delay.h>
#include <stdio.h>  // needed for sprintf()
#include <string.h> // needed for strlen()

// forward declarations so main() can call functions defined later in this file
void init_usart(void);
void send(unsigned char c);
void sendNum(unsigned int num);

void readADC(char adcn){
		//ADMUX = 0b0100000+adcn; // AVCC ref on ADCn
		ADMUX = 0b1100000+adcn; // AVCC ref on ADCn
		ADCSRA |= (1<<ADSC); // start a single conversion
        while (ADCSRA & (1<<ADSC)) {}; // wait for measurement
}

int main (void){
    DDRD=255;
	init_usart();
    ADCSRA = 0b10000111; //ADC Enable, Manual Trigger, Prescaler
    ADCSRB = 0;

    int adcs[8]={0,0,0,0,0,0,0,0};

    char i=0;
	for(;;){
		for (i=0;i<8;i++){readADC(i);adcs[i]=ADC>>6;}
		for (i=0;i<5;i++){sendNum(adcs[i]);send(44);}
		readADC(0);
		send(10);// LINE BREAK
		send(13); //return
		_delay_ms(3);_delay_ms(5);
	}
}

void sendNum(unsigned int num){
	char theIntAsString[7];
	int i;
	sprintf(theIntAsString, "%u", num);
	for (i=0; i < strlen(theIntAsString); i++){
		send(theIntAsString[i]);
	}
}


void send (unsigned char c){
	while((UCSR0A & (1<<UDRE0)) == 0) {}
	UDR0 = c;
}

void init_usart () {
	// ATMEGA48 SETTINGS
	int BAUD_PRESCALE = 12;
	UBRR0L = BAUD_PRESCALE; // Load lower 8-bits
	UBRR0H = (BAUD_PRESCALE >> 8); // Load upper 8-bits
	UCSR0A = 0;
	UCSR0B = (1<<RXEN0)|(1<<TXEN0); //rx and tx
	UCSR0C = (1<<UCSZ01) | (1<<UCSZ00); //We want 8 data bits
}

Here is the code that runs on the computer, allowing reading and real-time graphing of the serial data. It’s written in Python and has been tested in both Linux and Windows. Aside from PySerial (used to read the serial port), it requires no non-standard Python libraries, making it very easy to distribute. Graphs are drawn (somewhat inefficiently) using lines in Tk. Subsequent development went into improving the visualization, and drastic improvements have been made since this code was written; updated code will be shared shortly. Still, this version is functional, so it’s worth sharing.

import Tkinter, random, time
import socket, sys, serial

class App:

	def white(self):
		self.lines=[]
		self.lastpos=0

		self.c.create_rectangle(0, 0, 800, 512, fill="black")
		for y in range(0,512,50):
			self.c.create_line(0, y, 800, y, fill="#333333",dash=(4, 4))
			self.c.create_text(5, y-10, fill="#999999", text=str(y*2), anchor="w")
		for x in range(100,800,100):
			self.c.create_line(x, 0, x, 512, fill="#333333",dash=(4, 4))
			self.c.create_text(x+3, 500-10, fill="#999999", text=str(x/100)+"s", anchor="w")

		self.lineRedraw=self.c.create_line(0, 800, 0, 0, fill="red")

		self.lines1text=self.c.create_text(800-3, 10, fill="#00FF00", text=str("TEST"), anchor="e")
		for x in range(800):
			self.lines.append(self.c.create_line(x, 0, x, 0, fill="#00FF00"))

	def addPoint(self,val):
		self.data[self.xpos]=val
		self.line1avg+=val
		if self.xpos%10==0:
			self.c.itemconfig(self.lines1text,text=str(self.line1avg/10.0))
			self.line1avg=0
		if self.xpos>0:self.c.coords(self.lines[self.xpos],(self.xpos-1,self.lastpos,self.xpos,val))
		if self.xpos<800:self.c.coords(self.lineRedraw,(self.xpos+1,0,self.xpos+1,800))
		self.lastpos=val
		self.xpos+=1
		if self.xpos==800:
			self.xpos=0
			self.totalPoints+=800
			print "FPS:",self.totalPoints/(time.time()-self.timeStart)
		t.update()

	def __init__(self, t):
		self.xpos=0
		self.line1avg=0
		self.data=[0]*800
		self.c = Tkinter.Canvas(t, width=800, height=512)
		self.c.pack()
		self.totalPoints=0
		self.white()
		self.timeStart=time.time()

t = Tkinter.Tk()
a = App(t)

#ser = serial.Serial('COM1', 19200, timeout=1)
ser = serial.Serial('/dev/ttyUSB0', 38400, timeout=1)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

while True:
	while True: #try to get a reading
		#print "LISTENING"
		raw=str(ser.readline())
		#print raw
		raw=raw.replace("\n","").replace("\r","")
		raw=raw.split(",")
		#print raw
		try:
			point=(int(raw[0])-200)*2
			break
		except:
			print "FAIL"
			pass
	point=point/2
	a.addPoint(point)

If you re-create this device or a portion of it, let me know! I’d love to share it on my website. Good luck!





I’m staring deeply into a large, ominous, empty white text box on my screen as a small vertical text cursor blinks, staring back at me. I faintly remember writing my first blog entry eleven years ago (to the week), as a 15-year-old boy typing on his keyboard in the middle of the night. Every time I log in to write on my website, I see the same cursor, monotonous in its dependable, unvarying appearance. What does the cursor see in me? Definitely not the same person it saw the last time I logged in. What collection of words can even begin to describe the jumble that’s in my head? My name is Scott Harden, my future is changing before my eyes, and I’ve chosen to use this website to continue to document my life as it unfolds.

It’s been 298 days since I last wrote. For anyone who’s been through a hard time, part of getting better is acting like oneself again. For me, that’s doing the things that I used to do: designing circuits, writing code, building radios, and documenting my projects on my website. Me starting to write again – forcing myself to write again – is a step in a positive direction, and an indication that I am starting to be okay. My intention is to resume building and sharing projects on this website like I used to.

I feel it is important to address some of the recent changes in my life in a clear manner. I refuse to resume posting code and pictures of circuits all of a sudden without acknowledging the serious issues that I’ve dealt with and am continuing to deal with. I choose not to ignore these things – a decision made out of respect, and out of logic. It’s important that it’s obvious that (a) these things happened, (b) I’m working to be okay with them, and (c) I’m not ignoring them. I have noticed that ignoring serious issues by pretending they do not exist is one of the most dangerous and destructive coping mechanisms a person can exhibit. People behave this way around me often, but I, in an effort to improve my situation, refuse to ignore my challenges.

I will attempt to describe a few of the highlights of the last nine months of my life. It’s certainly not an exhaustive list, but it touches on a few of the significant experiences I have gone through, changes which have affected me, and a few of the events that made me happy since I last wrote here.

I began a new Ph.D. program in Neuroscience through the Interdisciplinary Program in Biomedical Science through the College of Medicine at the University of Florida. I am currently in my 4th year of dental school (my 10th year of college), and I’m considering starting a new program from scratch. Rather than throw away my dental degree (which I’ve been working on for several years and spent quite a lot of money to pursue), I am seeking a combined D.M.D./Ph.D. degree. This will allow me to graduate with a degree in dental medicine (D.M.D.) through the UF College of Dentistry at about the same time I get my Ph.D. in Neuroscience through the UF College of Medicine. I begin this August, and I couldn’t be more excited. As my Google Scholar author page reports, I’ve had a lot of excellent experience publishing research in the past, and I’m eager to begin life again as a biomolecular graduate student. I have yet to decide the specific direction of my research, but if I could do anything in the world, I would try to find a project that lets me dip into bioinformatics. The pictures below are of me on the day of my Ph.D. interview. I think I nailed it 🙂

It looks like I’m kind of sick. To make a long story short, I’m in a strange diagnostic limbo where it’s looking like I may have a pretty serious, rare, and somewhat difficult to diagnose form of cancer (stage 4 peripheral T-cell lymphoma), but at the same time it seems there is a chance it could be a weird, rare ailment that’s not as serious (an obscure autoimmune condition or an abnormal presentation of a chronic infection). The type of cancer it is currently proposed to be is pretty bad, and people usually don’t live too long after their diagnosis. Intriguingly, my symptoms and clinical presentation don’t reflect the rapidly destructive nature commonly associated with this disease, leaving open the possibility that it might not be that form of cancer after all. I have hard, enlarged lymph nodes (with abnormal histology demonstrated by 1 needle biopsy and 2 separate surgeries) in all quadrants of my body, and I have a lot of fatigue (which I combat with large volumes of coffee). Other than that, I’m relatively normal (physically, at least). The picture to the right is me on the first visit to the oncologist. I didn’t take it too seriously. I kept thinking (and still do to some degree) there’s no way it could be that bad. I mean, who my age gets sick like this? And right on the heels of losing my wife? It’s too coincidental, it can’t be that serious! Realistically we still don’t know for sure exactly what it is, and it’s too early to tell how this will affect my life. It’s a strange experience grappling with the idea that I might only live a few more years, knowing at the same time that my cancer might not be that bad and I may have a long, relatively normal life. I will know more a few months from now. In the meantime, I’m content planning my life as if I will be completely fine (which may certainly be the case), and if I get sick along the way I’ll re-evaluate my situation as the need arises. For now, I’m quite positive. If anyone were to get a one in a million disease and get through it unscathed, it would be me!

Here are a few relevant images from my medical experiences within the last few months. The left image is my PET CT, which depicts active metabolism. I was injected with radioactive glucose, and the tissues that work hard absorb it and appear as “hot” on the image. I was lying down (for about 30 minutes), so the only active organs should be my heart and brain. The kidneys/bladder are illuminated because they filtered some of the material from the blood. Actively growing tumors also take up glucose, and are labeled on this image. While the majority of my tumors are benign and slow growing, this image shows a few rapidly growing tumors: one on my upper right leg (which was removed with surgery #1, second picture), one small one at the base of my neck (which was removed with surgery #2, third picture), and a few scattered ones under each of my arms. The active ones were removed to aid in diagnosis, and it’s not feasible to remove all of the affected lymph nodes. The last picture shows me after my most recent surgery, coming out of anesthesia. I don’t even remember it being taken – I was pretty high at the time, as you can infer from the photo 🙂

Many people have asked me how I am doing after the death of my wife Angelina. Unexpectedly losing a spouse at any age is difficult, but at such a young age (I was 25, she was 24, we were married for 4 years) it’s even harder to rationalize sometimes. No words I write can even begin to describe the complex array of emotions I carry. Although I wish things had turned-out differently, I still feel fortunate to have had Angelina in my life, and I know unquestionably that I am a better person for having known her, and am thankful we got to share as much as we did. The picture on the right is the last one I have of us together. (I’m in the blue shirt, she’s in the black shirt). It was taken about two weekends before she died. We went with a group of my dental school classmates to Islands of Adventure (part of Universal Studios) in Orlando. If you had told me then that I would be a widower in a couple weeks, I couldn’t have even begun to comprehend it. I will always miss her, but I know that she would not want me to destroy my life because she is gone. In a note she wrote for me the day she died, she said “I love you, Scott, more than you will ever know.” It’s a phrase I remember and repeat in my mind every day, and it’s given me comfort over the last several months. I know that I will always carry part of her with me.

Several people have e-mailed me in concern because I had stopped posting over the last few months. Again, I was surprised and honored by the unexpected support I’ve been getting over the internet from people I don’t really even know. The following quote is one that came in this morning.

It has been a while since I have last visited and I really hope you’re doing OK since your last post. I can’t pretend to remotely know how you feel, even with the time that has passed since then. Just wanted to say I wish you well and miss your posts. All I can say is that I know that there have been periods of time in my life where I put down the soldering iron due to life… even for a couple of years. But, I always come back around because its part of who I am… and I hope it will be for you as well.

–Jeremy

The truth is, when Angelina died, life stopped. I stopped answering email, I stopped adding to my website, I stopped making phone calls, I even distanced myself from my own family for a while. Slowly, one by one, I’m resuming these activities. A few months ago, I unpacked my electronics workstation in my new apartment. It was a little premature, as I only worked at my station 1 day and took a few more months before I came back to it, but now it’s a weekly process. For a long time I felt guilty even thinking about picking-up my silly projects again. I felt that, after everything she and I had gone through, it seemed fickle. With a little time and some self-reflection, I realized that Angelina encouraged me to do these things throughout our marriage. I remember her telling me that she went to college and bragged “my husband is on hack-a-day“. I told her that my academic publications on PubMed are more impressive, and I’ll always remember her response: “Yeah, but you get more excited every time you make it on hack-a-day”. I remember this and remind myself that, if she were here, I think she’d still continue to encourage me to do what I enjoy. While silly little projects might not seem significant, they do make me happy, and I think she would be proud of me for working on them.

Angelina left a few special reminders on my workbench. A few months after she died, as I was unpacking my workstation again, I noticed a carving she made in my workbench. I almost threw away this table when I moved, and I’m so glad I didn’t. I think Angelina carved it with a car key, probably on the day this post was made, when I left her unattended for a while soldering at my station. I also found a message she wrote on my roll of solder. I didn’t remember seeing it before, so I imagine it was something she wrote in the last few days she was alive. It is really special to me.

One of the things I’ve done that’s really helped me a lot is to get out of my comfort zone a little bit and go places and do things I’ve never done before, often on a whim. This has led to a lot of fantastic experiences, given me some new things to occupy myself with, and reminded me that there’s still a lot of life out there left to live. Over the last several months, I tried to minimize the amount of time I spent doing things I’d done before. I noticed that the more time I spent doing the same old things, the more I felt like a part of me was missing, and the worse I felt. Changing scenery and being in new places felt less like something was missing, and began letting me establish the feeling of being okay on my own. I stayed away from Gainesville as much as I could, stayed away from family as much as I reasonably could (sorry mom and dad if you read this one day), and kept the amount of time I spent interacting with my old friends from dental school to a minimum. For some reason, nothing made me feel worse inside than being around my old environment, and avoiding it for a while was my way of trying to heal. For several months, this behavior was the only thing that made me feel better. Luckily, within the last few weeks, things have started to settle down, and I’m a little more comfortable settling back into some of my old environments. I’m going to toss in a sizable collection of random photos from the last several months. I’m not going to describe them in detail – it’s more an indication that yes, I’m alive, and yes, I’m getting out there a little. I’ll post them in little clusters with descriptions in between…

I marched in the UF homecoming parade with the University of Florida Gator Amateur Radio Club. I participated in the “out of the darkness” walk for suicide awareness sponsored by the Suicide Survivors of Northeast Florida group. I volunteered to operate as net control for Gator Amateur Radio Society during the 5 points of life “relay for life” marathon (official site). I spent some time in Georgia and caught a sunrise on the east coast of Florida over the ocean.

I practiced advanced suturing techniques utilizing a pig jaw. I actually messed it up pretty badly… oops? Next time I’ll do it on a person. Scary, right? I took an awesome photo of the dental school early in the morning, and the radio antennas are visible on the top. I have a picture from the 2011 “art walk” in downtown Gainesville, Florida. A few days after my wife died, I got a new apartment and moved in with a dental school classmate. His name is Ray, and he’s a pretty cool guy. I’m very lucky to have him. He moved out of his old apartment and into my apartment on less than a week’s notice. I spent some time in Georgia and had the opportunity to operate my Yaesu 857-D portable. I have a 25-foot boom with an inverted-vee wire antenna fed by 300-ohm ladder line. I worked Italy on 40m (7 MHz) CW (Morse code) with 20 watts.

I visited the base of the Appalachian Trail in Georgia, Bok Tower in Lake Wales, and a high-rise retirement home in Tampa. Another photo is of me with my two sisters (who were obviously adopted). You can see me operating as net control for the 5-points of life relay marathon early in the morning. Also there is a picture of me beside my friend Bill, W4HBK. He and his wife were wonderfully kind in offering me a place to stay over a portion of the Christmas break. Bill is a fellow QRSS’er (who mans the Pensacola Snapper) and we had a great time working with antennas, looking at QRP equipment, and testing 160M WSPR. It seemed like a random thing for me to do (drive 6 hours to spend several days with a family I hadn’t met before), but it was a wonderful experience and I am thankful to both of them.

Here are some more photos of me working as net control for the race. I also went shooting with a sharpshooters group at a shooting range after hours. That’s a scoped .22, and it’s accurate as heck. It was pretty fun to shoot. I also began working on some new microcontroller projects, and I even purchased an Arduino. I’ve had the Arduino for a few months and haven’t used it for anything yet. Between you and me, it’s an annoyance since I’m so used to programming these ATMEL microcontrollers in AVR-GCC, but I figure that I should do at least *one* project with an Arduino for the sake of doing it. It’s what everyone is using these days.

I started getting into electronics again a couple of months ago. I learned how to etch PCBs at home (using the toner transfer method with hot ferric chloride) and am making digital QRSS transmitter designs. My signal is the one that looks drunk, swerving all over the QRSS road. The picture of me by a fence was taken in Ocala at a go-kart racetrack. One of my professors took a few students, and it was a pretty cool day. It was the week I found out I was sick.

There I am shooting again at a friend’s house. I don’t remember the details of that weapon (I shot quite a few that day), but I think it was a 9mm with a red-dot scope. I spent a little more time in the wilderness up in Georgia, and had some time to stop by Georgia Tech two days ago. I visited to watch a friend’s friend give her master’s thesis defense. She did really well, and while I was there I snapped a picture of the campus indicating the proximity of the Georgia Tech Amateur Radio Club (W4AQL) to the Coca-Cola factory. Also pictured is my QRSS transmitter, a somewhat novel design using no potentiometers. The oscillator is a 74HC240 with a varactor diode, allowing frequency adjustment via a potentiometer, and a lowpass-filtered PWM line from the microcontroller to provide modulation. The degree of modulation is adjustable via the second potentiometer.

The nighttime image is a small restaurant a few friends took me to. The lights were spectacular. I’m also eating a frozen banana. I get my hair cut in my friend’s kitchen. His name is Don and he is a dental student too. My hair cuts are $5. I visited the coast and have a strange picture of me touching my hair. I operated the W4DFU radio station during the CQWW contest. I also visited a museum. Can you believe I went my entire life without ever actually seeing a dinosaur skeleton in person?

The photo from space indicates a successful retrieval of a camera that was launched back in August. It was placed on a high-altitude balloon launched the same day Angelina and I drove to Orlando for the weekend (the next day, the photo of us above was taken). It took photos as it ascended, then landed, and it was lost for several months. Some boys playing in the woods found it and gave it to their mom, who contacted us. The photos are pretty impressive, and I’ll post details in a future entry. There’s another photo of me on a beach, and one of me at a jellyfish exhibit. I look funny in the car playing with my camera. I recently realized that if I pushed the correct button, there was a camera facing me that could take a picture. Also there’s a photo of me at an old artillery battle fortress in Pensacola, Florida. I’m not much of a history buff; I’m not sure what its formal name is.

It’s nice being in the woods. A friend of mine owns a house far enough out of town that we can shoot in the back yard. Bok Tower in Lake Wales as seen from beneath a tree. I went there for a carillon concert one night after the sun went down. It was absolutely beautiful. I visited a small Irish pub in North Florida one day and listened to a guitar player. It was so sparsely populated that, at one point, I think my table was the only one occupied. It was nice though. The picture of me sitting on the couch is from Christmas at a big family get-together. That was an experience. Like I mentioned above, my wife’s death hurts the most when I’m in a place where I feel she should be. When I do “old things”, go to the places we used to go together, and find myself in situations where we were always together, her absence is unignorably overwhelming and painful. I imagine holidays will continue to hurt for a while. I don’t know how I got this photo; did my sister take it? Anyhow, that’s that. In another photo, my friend Ben sits at my workstation. Occasionally I invite engineering students to spend time at my apartment. With all the silly old electrical equipment I’ve amassed over the years, my apartment is turning into quite the electronics workshop! I’m posing in a tux with my sister, sporting a fine mustache.

At the March for Babies walk in Gainesville, Florida (where I served again as net control station for the event), there was a military representative who brought a rock climbing wall. I made it up two of the three paths. That third one was rough, and I never got to the top. My sister got married a few weeks ago. She went from Kelly Harden to Kelly Tran, and I got to be at the wedding. Actually, I was part of the wedding. I read 1 Corinthians 13 during the service. It’s commonly known for “Love is patient, love is kind…“, but there were some ironic parts of the chapter that were a little hard to read: “If I have everything but do not have love, I am nothing” (1-3), “love always perseveres” (7), and “Love never fails” (8). I took it as a personal challenge, and made myself be okay with the way life is. The harder something is for me emotionally, the more I feel like I’ve grown after getting through it. There’s another photo of me at my workstation. I don’t think I knew it was being taken. There’s a picture of a poster I found in a shoe store. The message was pretty cool, so I snapped a picture. One of the days working in the dental clinic, let’s just say I had “computer problems”.

Kicking off the transition back into internet productivity, I will share some code which I used to create today’s entry. I wanted to rapidly browse through numerous high-resolution images to choose which ones to share, but pressing the left and right arrows was too slow (some of the large images took 5+ seconds to load). I therefore copied all of the images (*.jpg) created within the last 6 months into a single folder. I whipped up a script that uses ImageMagick’s convert feature (which is available for Linux and Windows) to create 300px-wide thumbnails and place them on a static HTML page for viewing in a browser. Each thumbnail is clickable, revealing the original, large image. Below is the script I used to do this.

import os
out="<html><body>"
fnames=os.listdir("./")
fnames.sort() # alphabetize them
for fname in fnames:
    if ".jpg" in fname or ".JPG" in fname:
        if "sml_" in fname: continue
        cmd="convert -resize 300 %s sml_%s"%(fname,fname)
        print cmd # note that this requires ImageMagick installed
        os.system(cmd)
        out+='<a href="%s">'%(fname)
        out+='<img src="sml_%s"></a><br>'%(fname)
        out+='%s<br><br><br>\n'%(os.path.split(fname)[1])
out+="</body></html>"
f=open("pics.html",'w')
f.write(out)
f.close()
print "DONE"

What do I plan to do from here? I’m starting my PhD program in August, and I plan to hit it hard. I feel lucky that I have the opportunity to study biomolecular science again. I’m holding my breath for it to start, and am looking forward to having something academically and intellectually challenging to devote myself to. I plan to continue making my electronics and programming projects, and publishing them here on my website. If my health changes, it all might change, but I’m holding out that I’ll be okay.  All in all, I’ve lived through an unbelievably rough year, but I look forward to getting back into the game and pushing into life head first.





I’m working to further simplify my frequency counter design. This one is simpler than my previous design both in hardware and software! Here’s a video to demonstrate the device in its current state:

I utilize the ATMega48’s hardware counter, which is synchronous with the system clock, so it can only measure frequencies less than half its clock speed. I solve this by dividing the input frequency by 8 and clocking the chip at 12 MHz. This allows me to measure frequencies up to about 48 MHz, but it can easily be adapted to measure over 700 MHz by dividing the input by 128. Division is performed by a 74HC590 8-bit counter (not a 74HC595 as I accidentally said in the video, which is actually a common shift register), allowing easy selection of the input divided by 1, 2, 4, 8, 16, 32, 64, or 128. The following image shows the o-scope displaying the original signal (bottom) and the divided-by-8 result (top):
DSCN1630
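To make the range arithmetic above concrete, here is a quick sanity check in plain Python (just reproducing the numbers from the paragraph above; the ceiling assumes the hardware counter tops out near half the 12 MHz system clock):

# rough frequency-counter range math from the paragraph above
clock_hz = 12e6                 # ATMega48 system clock (12 MHz)
counter_max_hz = clock_hz / 2   # synchronous counting limit: just under half the clock

for divider in (8, 128):        # two of the 74HC590 prescale options mentioned above
    print("divide-by-%d: input up to ~%.0f MHz" % (divider, counter_max_hz * divider / 1e6))

gate_s = 0.1                    # the 1/10th-second gate used by the firmware
print("resolution with the /8 prescaler: %d Hz" % ((1 / gate_s) * 8))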

The device outputs graphically to an LCD simply enough. That LCD is from eBay and is only $3.88 shipped! I’m considering buying a big box of them and implementing them in many more of my projects. They’re convenient and sure do look nice!
DSCN1634

The signal I test with comes from an oscillator I built several months ago. It’s actually an SA612-style receiver whose oscillator is tapped, amplified, and output through a wire. It’s tunable over almost all of 40m with a varactor diode configuration. It was the start of a transceiver, but I got so much good use out of it as a function generator that I decided to leave it as it is!
DSCN1637

THIS IS HOW THE PROGRAM WORKS: I don’t supply a schematic because it’s as simple as could be. Divide the input frequency down to something relatively slow, below 1 MHz at least. Configure the 16-bit counter to accept an external pin as the counter source (not a prescaled clock, as I often use in other applications). Then set the timer value to 0, _delay_ms() a certain amount of time (1/10th of a second), and read the counter value. Multiply it by 10 to account for the 1/10th second, then multiply it by 8 to account for the divider, and it’s done! It will update 10 times a second, with a resolution down to 10*8 = 80 Hz. That’s well within the range of amateur radio uses! If you’re considering replicating this, read up on how to use hardware counters with ATMEL AVR microcontrollers. That should be enough to get you started! Here’s the code I used…

For the LCD, this code requires an LCD library.

#include <stdlib.h>
#include <avr/io.h>
#include <avr/pgmspace.h>
#include <util/delay.h>
#include "lcd.h"
#include "lcd.c"

int main(void)
{
	TCCR1B=0b00000111; // count external clock on T1 pin, rising edge
    char buffer[8];
	long toshow=0;
	char mhz=0;
	int khz=0;
	int hz=0;
    lcd_init(LCD_DISP_ON);
	for(;;){
    	lcd_clrscr();

    	lcd_gotoxy(0,0);
		itoa(mhz , buffer, 10);
		lcd_puts(buffer);
		lcd_puts(".");

		if (khz<100){lcd_puts("0");}
		itoa(khz , buffer, 10);
		lcd_puts(buffer);

		itoa(hz/100 , buffer, 10);
		lcd_puts(buffer);

		lcd_puts(" MHz");

		TCNT1=0;
		_delay_ms(99);
		_delay_us(312);
		toshow=TCNT1;
		toshow=(long)toshow*16*10; // tenth second period
		mhz=toshow/1000000;
		toshow=toshow-mhz*1000000;
		khz=toshow/1000;
		toshow=toshow-khz*1000;
		hz=toshow;
	}
}




While trying to attack the problem described in the previous entry, it became clear that a logic analyzer would be necessary. I thought I’d try to build one, and my first attempt was so close to being successful, but not quite there. It records 19 channels (the maximum number of pins available on the ATMega48 not occupied by the status LED or USB connection pins) at a rate just under 1,000 samples per second. The USB connection to the PC is obvious, and it utilizes the V-USB project to bit-bang the USB protocol. I’m posting this in part because some of the comments on my entry two posts ago were disheartening, discouraging, and even downright vicious! I made a simple way to send numbers to a PC through the sound card, so what? Don’t be nasty about it! Meh, internet people. Anyway, here’s a marginally more proper way to send data to a PC with USB and an AVR (logging and interface designed in Python), but I’ll probably still get yelled at for it.

As you can see from the video, it’s good but not good enough. If I could get 2,000 samples per second I’d probably be OK, but it’s just not quite fast enough with its current, ultra-simplistic method of sample recording. I’ll figure out a fancier way to build a logic analyzer – it’s obvious the platform is there, it just needs some refinement.

A few stills:
diy logic analyzer 1
diy logic analyzer 2

UPDATE! The more I think about it, the more I think this might be just good enough to work! Look at the stagger in those peaks near the top – that’s probably the lines telling which character to display. Data between the peaks indicates the value to be provided, and I should have enough time to accurately measure that… Maybe this is good enough after all? I’ll have to run some more tests tomorrow…

Where’s the code? It kills me to do this, but I need to withhold the chip-side code. I’m working on an idiot’s guide to USB connectivity with AVR microcontrollers, and I’d rather post the simplest-case code first, then share complicated stuff like this. I’ll post the Python scripts:

# LOGIC.PY - this script will log (or print) raw data from the USB device
from optparse import OptionParser
import time
import usb.core
import usb.util
import os

while True:
        print "nTrying to communicate with the Gator Keyer ...",
        dev = usb.core.find(find_all=True, idVendor=0x16c0, idProduct=0x5dc)
        if len(dev)==0: print "FAIL"
        dev=dev[0]
        dev.set_configuration()
        print "there it is!"
        break


def readVals():
    x=dev.ctrl_transfer(0xC0, 2, 2, 3, 4).tolist()
    return x

def toBinary(desc):
	bits=[]
	for i in range(7,-1,-1):
		if (2**i>desc):
			bits.append('0')
		else:
			bits.append('1')
			desc=desc-2**i
	return bits

def toStr(lists):
	raw=[]
	for port in lists: raw+=toBinary(port)
	return ''.join(raw)






### PROGRAM START ##################
live=False
#live=True
start=time.time()
if live==True:
	while True:
		a,b,c,d=readVals()
		if not a==123: continue #bad data
		elapsed=time.time()-start
		print "%.010f,%s"%(elapsed,toStr([b,c,d]))
else:
	times=0
	data=''
	f=open("out.txt",'a')
	while True:
		a,b,c,d=readVals()
		if not a==123: continue #bad data
		elapsed=time.time()-start
		data+="%.010f,%sn"%(elapsed,toStr([b,c,d]))
		times+=1
		if times%1000==999:
			print "%d readings / %.02f = %.02f /sec"%(times,elapsed,times/elapsed)
			f.write(data)
			data=""
#logicGraph.py - this will show the data in a pretty way
import matplotlib.pyplot as plt
import numpy

c={
0:"",
1:"",
2:"blk sol",
3:"yel str",
4:"yel sol",
5:"pur sol",
6:"pur str",
7:"",
8:"",
9:"",
10:"blu sol",
11:"blu str",
12:"orn sol",
13:"orn str",
14:"pnk sol",
15:"pnk str",
16:"",
17:"",
18:"",
19:"",
20:"",
21:"",
22:"",
23:"",
24:"",
}

print "loading"
f=open("out.txt")
raw=f.readlines()
f.close()

print "crunching"
times=numpy.array([])
data=numpy.array([])
for line in raw:
	if len(line)<10: continue
	line=line.replace("n",'').split(',')
	times=numpy.append(times,float(line[0]))
	bits = []
	for bit in line[1]:
		if bit=="1":bits.append(1)
		else:bits.append(0)
	data=numpy.append(data,bits)

columns=24
rows=len(data)/columns
data=numpy.reshape(data,[rows,columns])
print "DONE processing",len(data),"linesnn"
print "plotting..."
plt.figure()
plt.grid()
for i in range(len(c.keys())):
	if c[i]=="": continue
	plt.plot(times,data[:,i]+i*1.1,'-',label=c[i])
plt.legend()
plt.show()




 

[warning]

UPDATE: Check out what happened when I revisited this project and made it wireless two years later!:

http://www.swharden.com/blog/2013-05-19-wireless-microcontroller-pc-interface-for-3-21/

[/warning]

 

This page describes a method of sending data from a microchip to a PC using pulses of data. It’s an alternative to more traditional serial or USB methods of connectivity. It’s not intended as a solution for consumer products, but rather an easy hack for hobbyists to employ if they don’t have the equipment for other methods. This method doesn’t require *any* circuitry, just a sound card. The one built in your computer is fine, but I’m using a $1.30 USB sound card for simplicity. It boils down to just a single microcontroller pin connected to a PC sound card microphone jack!

UPDATE: This story was featured on this post of HackADay.com!

This is the finished product, ready to send data to a PC:
DSCN1532

MY PROBLEM: I want to send data from a simple microcontroller to a PC. While USART and a serial port is the common solution (like I’ve done before), it’s not convenient because it requires a level converter (like a MAX232, about $4), a crystal (specific values based on bit rate and error rate; if you’re lucky you might have the right value in your junk box), and an archaic PC which actually has a serial port. A USB serial port adapter sounds clever, but many aren’t supported on Linux, Windows Vista, or Windows 7. Also, many small chips (most of the ATTiny series) don’t have built-in serial capabilities, so it has to be bit-banged in software! Yuk! The second choice would be USB. This requires a crystal too, zener diodes, and bit-banging the USB protocol with something like V-USB, since most of the AVR series don’t have built-in USB (do they even make breadboardable DIP chips with USB?). Even so, it requires drivers, custom software, cross-platform frustrations, etc. I know PIC has some 18F series chips with USB, but I don’t feel like switching architectures just to send a few bytes of data to a PC. FTDI has the FT232R chip, which is a USB serial port adapter, but it’s expensive (about $5) and doesn’t come in DIP, so no breadboarding! Sure there are adapter boards, but that just adds to the cost. I’m not excited about a $5 solution for a $1 microcontroller. I even did a bit of trolling on AVR Freaks to see if anyone could help me out – just more of the same!

MY SOLUTION: Send data through the sound card! USB sound cards are $1.30 (shipped) on eBay! It couldn’t be simpler. Send pulses, measure distance between pulses. Short pulses are a zero, longer ones are a 1, and very long pulses are number separators. A Python solution with PyAudio allows 1 script which will work on Mac, Linux, Windows, etc, and because it calibrates itself, this will work on any chip at any clock rate. Data is initiated with calibration pulses so timing is not critical – the PC figures out how fast the data is coming in. Check it out! (scroll way down for a bidirectional communication solution)
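To illustrate the encoding without any hardware, here is a toy Python model of the pulse-gap scheme described above (the gap lengths are arbitrary stand-ins for the real timings, and the calibration pulses are skipped; it just shows how zeros, ones, and separators are turned back into numbers):

# toy model of the pulse-spacing protocol: each number is sent LSB-first as a
# series of gaps between pulses, and an extra-long gap acts as a separator
GAP_ZERO, GAP_ONE, GAP_SPACE = 1, 3, 5    # arbitrary relative gap lengths

def encode(value):
    # return the list of gaps the chip would emit for one number
    gaps = [GAP_SPACE]                    # separator before each number
    while value:
        gaps.append(GAP_ONE if value & 1 else GAP_ZERO)
        value >>= 1                       # least-significant bit first
    return gaps

def decode(gaps):
    # rebuild numbers from a stream of gaps using simple thresholds
    numbers, bits = [], []
    for gap in gaps:
        if gap >= GAP_SPACE:              # separator: finish the previous number
            if bits:
                numbers.append(sum(b << i for i, b in enumerate(bits)))
            bits = []
        else:
            bits.append(1 if gap >= GAP_ONE else 0)
    if bits:
        numbers.append(sum(b << i for i, b in enumerate(bits)))
    return numbers

stream = encode(1023) + encode(512) + encode(7)
print(decode(stream))                     # prints [1023, 512, 7]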

Here is a sound card I used for bidirectional communication:
DSCN1466 DSCN1470

Output graph (Python and Excel) of temperature when I put a soldering iron near the sensor:
python excel

~ UNIDIRECTIONAL SOLUTION ~

The following code is designed to have a chip send data to your PC automatically. This can be run on any micro-controller (PIC or AVR I guess, the concept is the same) at any clock rate. Just make sure the sound card is recording fast enough to differentiate pulses. (keep scrolling down for a bidirectional solution)

A NOTE ABOUT MY CODE: This is just the code I used for my demonstration. It might be more logical for you to write your own since the concept is so simple. I’m a dental student, not a programmer, so I’m sure it’s not coded very elegantly. I didn’t work hard to make this code easy to read or easy to share. With that being said, help yourself!

/*The following code is written in AVR-GCC for an ATTiny44a.
It reads ADC values on 3 pins and reports it each second along
 with a number which increments each time data is sent.
It's designed as a starting point, allowing anyone to
customize it from here!*/

#include <avr/io.h>
#include <avr/delay.h>
#include <avr/interrupt.h>

// bytes we want to send to the PC
volatile int data1=0;
volatile int data2=0;
volatile int data3=0;
volatile int data4=0;

void pulse(char size); // forward declaration so solid() can call it

void solid(){  // dont touch
	_delay_ms(1);
	pulse(1);pulse(1);pulse(1);pulse(3);pulse(3);
	pulse(3);pulse(5);pulse(5);// CALIBRATION PULSES
}
void pulse(char size){ // dont touch
	PORTA|=_BV(PA3);
	_delay_us(100);
	PORTA&=~_BV(PA3);
	while (size){size--;_delay_us(100);}
}
void sendVal(unsigned long tosend){ // dont touch
	pulse(5); // send a space
	while (tosend){
		if (tosend&1){pulse(3);} // send ONE
		else {pulse(1);} // send ZERO
		tosend=tosend>>1;
	}
}

int readADC(char adcNum){
	_delay_ms(1);
	ADMUX=adcNum; // select which ADC to read, VCC as ref.
	ADCSRA=0b11000111; // enable, start, 128 prescale
    while (ADCSRA&( 1<<ADSC)) {}; // wait for measurement
	return ADC;
}

void takeReadings(){
        data1=readADC(0); // ADC0
        data2=readADC(1); // ADC1
        data3=readADC(2); // ADC2
		data4++; // increment just because we want to
}

void sendStuff(){ // EDIT to send what you want
	solid(); //required
	sendVal(12345); //required
	sendVal(12345); //required
	sendVal(54321); //required

	sendVal(data1);
	sendVal(data2);
	sendVal(data3);
	sendVal(data4);

	pulse(1); //required
}

int main(){
	DDRA|=_BV(PA2)|_BV(PA3);
	for (;;){
		_delay_ms(1000);
		takeReadings();
		sendStuff();
	}
	return 0;
}
"""
file name: listenOnly.py

This is the PC code to listen to the microphone and display
and log the data. It probably does NOT need adjustment!
 Make sure the correct sound card is selected (in the code)
 and make sure microphone input is turned up in volume control.

This code is what was used on my PC for the demonstration
video. This is the listenOnly.py file which will turn any audio
 detected from a sound card into data, optionally logging it
(if the last few lines are uncommented). This also works to
capture data for the bidirectional communication method,
described below on this website.

If this is running but no data is coming through, make sure the
microphone is selected as a recording device, the correct sound
card is selected, and the microphone volume is turned to high.

REQUIRED: To run this, you need to have the following installed:
-- Python 2.6
-- numpy for python 2.6
-- matplotlib for python 2.6
-- pyaudio for python 2.6
(other versions may work, but this is what I'm using)
"""
import numpy
import pyaudio
import matplotlib.pyplot as plt
import wave
import time

def listCards(dontAsk=True):
    p=pyaudio.PyAudio()
    print "SOUND CARDS:"
    for i in range(p.get_default_host_api_info()["deviceCount"]):
        if p.get_device_info_by_index(i)["maxInputChannels"]>0:
                cardName = p.get_device_info_by_index(i)["name"]
                cardIndex = p.get_device_info_by_index(i)["index"]
                print "[%d] %s"%(cardIndex,cardName)
    if dontAsk: return
    return int(raw_input("CARD NUMBER TO USE:"))

cardID=1
listCards()
print "USING CARD:",cardID

rate=44100.0
sampleSize=1024

def data2vals(data):
    vals=numpy.array([])
    lastPeak=0
    for i in range(1,len(data)):
        if data[i]==True and data[i-1]==False:
            if lastPeak>0: vals=numpy.append(vals,i-lastPeak)
            lastPeak=i
    return vals

def binary2dec(binary):
    binary=binary[:-1]
    dec=0
    s=""
    for i in range(len(binary)):
        dec=dec*2
        dec+=binary[i]
        s="%d"%binary[i]+s
    #print s,"=",dec #11111100101100000 = 3391
    return dec

def readVals(vals):
    if len(vals)<7: return False
    vals2=[]
    aLow = min(vals[0:3])
    aMed = min(vals[3:6])
    aHigh = vals[6]
    thresh1=sum([aLow,aMed])/2+2
    thresh2=sum([aMed,aHigh])/2+2
    #print "tresholds:",thresh1,thresh2
    #print vals
    vals=vals[8:]
    binary=[]
    for i in range(len(vals)):
        if vals[i]>thresh2:
            vals2.append(binary2dec(binary))
            binary=[]
        if vals[i]>thresh1:binary=[1]+binary
        else:binary=[0]+binary
    vals2.append(binary2dec(binary))
    for i in range(len(vals2)):
        if vals2[i]==54321: return vals2[i+1:]
    return False

def playFile():
    chunk = 1024
    wf = wave.open("short_onenum.wav", 'rb')
    p = pyaudio.PyAudio()
    stream = p.open(format =
                    p.get_format_from_width(wf.getsampwidth()),
                    channels = wf.getnchannels(),
                    rate = wf.getframerate(),
                    output = True)
    data = wf.readframes(chunk)
    while data != '':
        stream.write(data)
        data = wf.readframes(chunk)
    stream.close()

def captureData():
    pyaud = pyaudio.PyAudio()
    stream = pyaud.open(format=pyaudio.paInt16,channels=1,
        rate = 44100,input_device_index=cardID,input=True,output=True)
    sample=numpy.array([])
    while True:
        sampleNew=numpy.fromstring(stream.read(sampleSize),dtype=numpy.int16)
        sampleNew=(sampleNew<-25000)*1
        if True in sampleNew: sample=numpy.append(sample,sampleNew)
        else:
            if len(sample):
                stream.close()
                return sample
    stream.close()

tone_quiet=0

def buildNumber(num=123):

    if num>255: print "NUMBER TOO HIGH!!!"
    #print num,'=',
    num+=1
    for i in [7,6,5,4,3,2,1,0]:
        if num>2**i:one();num=num-2**i;#print"1",
        else: zero();#print"0",
    #print
    space()

def pulse():
    global data
    data+=[-30000]*10

def space():
    global data
    data+=[tone_quiet]*900
    pulse()

def one():
    global data
    data+=[tone_quiet]*600
    pulse()

def zero():
    global data
    data+=[tone_quiet]*300
    pulse()

def silence(msec=1000):
    global data
    data+=[tone_quiet]*int(41.1*msec)

data=[]
def sendAudio(numbers=[11,66,77]):
    global data
    data=[]
    silence(100)
    buildNumber(250)
    print "SENDING",
    for numba in numbers:
        buildNumber(numba)
        print numba,
    buildNumber(250)
    silence(100)
    data=numpy.array(data)
    data=-data
    data=data.tostring()
    print

    p = pyaudio.PyAudio()
    stream = p.open(rate=44100, channels=1, format=pyaudio.paInt16,
                    input_device_index=cardID, output=True)
    stream.write(data)
    stream.close()
    p.terminate()

i=0
while True:
    i+=1
    val=readVals(data2vals(captureData()))
    if val == False: continue
    line=""
    for item in val: line+=str(item)+","
    print i,line
    #f=open('log.csv','a')
    #f.write("%sn"%line)
    #f.close()

~ BIDIRECTIONAL SOLUTION ~

What if we want to send data TO the microcontroller? The solution is a little more complex, but quite doable. Just add an extra wire to the sound card’s speaker output and attach it to PCINT0 (the highest level internal interrupt). This is intended for advanced users, and if you’re doing this you probably are better off with USB or serial anyway! … but heck, why not do it as a proof of concept!

Note that the USB sound card’s speaker output was not powerful enough to trigger the digital input pin of the AVR, so an inverting buffer was made from a single NPN transistor (2N3904). The hardware interrupt pin was attached to the collector, and the collector was tied to +5V through a 220 ohm resistor. The emitter was grounded. The base was attached directly to the sound card output. I also tried running the sound card output through a small series capacitor (0.1uF) and biasing the base to ground through a 1 Mohm resistor, and it worked the same. Hardware: simple. Chip-side software… a little more complex.

### VIDEO ###

"""
This code is what was used on my PC for the
 demonstration video. The listenonly.py file
 (above on site) was also used without modification.
"""
import pyaudio
from struct import pack
from math import sin, pi
import wave
import random
import numpy
import time

RATE=44100
maxVol=2**15-1.0 #maximum amplitude
p = pyaudio.PyAudio()
stream = p.open(rate=44100, channels=1, format=pyaudio.paInt16,
		input_device_index=1, output=True)

def pulseZero():
    global wvData
    wvData+=pack('h', 0)*30
    wvData+=pack('h', maxVol)

def pulseOne():
    global wvData
    wvData+=pack('h', 0)*40
    wvData+=pack('h', maxVol)

def pulseSpace():
    global wvData
    wvData+=pack('h', 0)*50
    wvData+=pack('h', maxVol)

def buildNumber(num=123):
    if num>255: print "NUMBER TOO HIGH!!!"
    num+=1
    for i in [7,6,5,4,3,2,1,0]:
        if num>2**i:
            pulseOne()
            num=num-2**i
        else:
            pulseZero()

wvData=""
wvData+=pack('h', 0)*2000
pulseOne() #required before sending data

buildNumber(55)
buildNumber(66)
buildNumber(77)
buildNumber(123)

wvData+=pack('h', 0)*2000

while True:
	print "SENDING",
	stream.write(wvData)
	raw_input()
/*
This code is what was used on my AVR
microcontroller for the demonstration video
*/
#include <avr/io.h>
#include <avr/delay.h>
#include <avr/interrupt.h>

volatile long commandIncoming=0;
volatile char command1=0;
volatile char command2=0;
volatile char command3=0;
volatile char command4=0;
volatile char bitsGotten=0;

// timing thresholds are critical! Send pulses to the chip
// and have it report the time between them. Use this to
// determine the best threshold value for your application.
// The ones here must be changed if you run at a speed other
// than 1mhz or if you use different timings in PC software
#define thresh_low 100 // between this and the next
#define thresh_high 130 // is the range for a logical 'one'

// ######## OUTGOING AUDIO DATA #########
void pulse(char size); // forward declarations for functions used before they are defined
void sendStuff();

void solid(){
	_delay_ms(1); //LONG LOW
	pulse(1);pulse(1);pulse(1);pulse(3);pulse(3);
	pulse(3);pulse(5);pulse(5);// CALIBRATION PULSES
}
void pulse(char size){
	PORTA|=_BV(PA3);
	_delay_us(100);
	PORTA&=~_BV(PA3);
	while (size){size--;_delay_us(100);}
}
void sendVal(unsigned long tosend){
	pulse(5); // send a space
	while (tosend){
		if (tosend&1){pulse(3);} // send ONE
		else {pulse(1);} // send ZERO
		tosend=tosend>>1;
	}
}

// ######## INCOMING AUDIO DATA #########
// NOTE THAT INPUTS ARE NORMALLY *HIGH* AND DROP *LOW* FOR SIGNAL
SIGNAL (PCINT0_vect) { // audio input trigger
	TIMSK0|=(1<<TOIE1); //Overflow Interrupt Enable
	if (TCNT0<10){return;} // seem too fast? ignore it!
	// Enable the following line to test custom timings
	//command1=command2;command2=command3;
	//command3=command4;command4=TCNT0;
	bitsGotten++;
	commandIncoming=commandIncoming*2; // shift left
	if (TCNT0>thresh_low){commandIncoming++;} // make 1
	TCNT0=0;
}

ISR(TIM0_OVF_vect){ // TIMER OVERFLOW
	if (bitsGotten){sendStuff();}
}

void fillCommands(){
	command1=(char)(commandIncoming>>24);
	command2=(char)(commandIncoming>>16);
	command3=(char)(commandIncoming>>8);
	command4=(char)(commandIncoming);
}

void sendStuff(){
	TIMSK0=0; //Overflow Interrupt
	cli(); // disable interrupts!
	fillCommands();
	solid(); // start data transmissions with this
	sendVal(12345);
	sendVal(12345);
	sendVal(54321);
	sendVal(command1);
	sendVal(command2);
	sendVal(command3);
	sendVal(command4);
	sendVal(1234567890);
	pulse(1);
	bitsGotten=0;
	sei(); // enable interrupts again!
	TIMSK0|=(1<<TOIE1); //Overflow Interrupt
}

// ######## MAIN PROGRAM #########
int main(){

	DDRA|=_BV(PA2)|_BV(PA3);

	// SET UP FOR SOUND CARD INTERRUPT
	MCUCR = 0b00000010; // trigger interrupt on falling edge
	GIMSK = 0b00010000; // pin change interrupt enable 0
	GIFR =  0b00010000; // flag register, same as above
	PCMSK0 = (1<<PCINT0); // Set Pin to use (PCINT0)
	sei(); // enable global interrupts

	// SET UP 8-bit COUNTER
	TCCR0B|=0b00000010; // prescaler clk/8
	//TCCR1B|=(1<<CS12)|(1<<CS10); // prescaler 1024
	TIMSK0|=(1<<TOIE1); // enable overflow interrupt
	TCNT0=0;//Initialize our variable (set for 1/15th second?)

	// MAIN PROGRAM
	for (;;){}
	return 0;

}
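
For reference, the PC-side decoding boils down to timing the gaps between the clicks the chip sends back. Here’s a rough Python 3 sketch of that idea (just an illustration, not the original listenonly.py; the one-second capture length and the amplitude threshold are assumptions):

import pyaudio
import numpy as np

RATE = 44100
p = pyaudio.PyAudio()
stream = p.open(rate=RATE, channels=1, format=pyaudio.paInt16, input=True)
data = np.frombuffer(stream.read(RATE), dtype=np.int16)  # one second of audio
stream.close()
p.terminate()

# sample positions where the signal first rises above an arbitrary threshold
threshold = 10000
edges = np.nonzero((data[1:] > threshold) & (data[:-1] <= threshold))[0]

# the gaps between clicks carry the information; the chip uses three distinct
# gap widths (its pulse(1), pulse(3), and pulse(5)), so expressing each gap
# relative to the shortest one separates zeros, ones, and spaces
gaps = np.diff(edges)
if len(gaps):
    print(np.round(gaps / float(gaps.min()), 1))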

In closing, I’m tickled this works so well. It’s funny to me that no one’s really done this before in the hobby field. I’m sure I’m not the only one who wished there were an easy way to do this. I’m sure the process could be greatly improved, but this is a fun start. Wow, it’s late, I should get to bed. I have to treat patients tomorrow morning!

PS: If you replicate this concept, let me know about it! I’d love to see your project!





My current project requires creating stereo audio in real time with Python. I’m using PyAudio to send the audio data to the sound card, but in this simple example I demonstrate how to create mono and stereo sounds with Python. I’m disappointed there aren’t good, simple examples on the internet, so I’m sharing my own. It doesn’t get much easier than this!

# Python 2 example
from struct import pack
from math import sin, pi
import wave
import random

RATE=44100

## GENERATE MONO FILE ##
wv = wave.open('test_mono.wav', 'w')
wv.setparams((1, 2, RATE, 0, 'NONE', 'not compressed'))
maxVol=2**15-1.0 #maximum amplitude
wvData=""
for i in range(0, RATE*3):
	wvData+=pack('h', maxVol*sin(2*pi*500.0*i/RATE)) #500Hz
wv.writeframes(wvData)
wv.close()

## GENERATE STEREO FILE ##
wv = wave.open('test_stereo.wav', 'w')
wv.setparams((2, 2, RATE, 0, 'NONE', 'not compressed'))
maxVol=2**15-1.0 #maximum amplitude
wvData=""
for i in range(0, RATE*3):
	wvData+=pack('h', maxVol*sin(2*pi*500.0*i/RATE)) #500Hz left
	wvData+=pack('h', maxVol*sin(2*pi*200.0*i/RATE)) #200Hz right
wv.writeframes(wvData)
wv.close()

The output is two sound files which look like this:

mono
stereo

Here’s a more modern Python 3 example:

"""Python 3 script to make a mono WAV file with an audio tone"""

from struct import pack
from math import sin, pi
import wave
import random
from os.path import abspath

# create a bytestring containing "short" (2-byte) sine values
SAMPLE_RATE = 44100
waveData = b''
maxVol = 2**15-1.0
frequencyHz = 500.0
fileLengthSeconds = 3
for i in range(0, SAMPLE_RATE * fileLengthSeconds):
    pcmValue = sin(i*frequencyHz/SAMPLE_RATE * pi * 2)
    pcmValue = int(maxVol*pcmValue)
    waveData += pack('h', pcmValue)

# save the bytestring as a wave file
outputFileName = 'output.wav'
wv = wave.open(outputFileName, 'w')
wv.setparams((1, 2, SAMPLE_RATE, 0, 'NONE', 'not compressed'))
wv.writeframes(waveData)
wv.close()
print(f"saved {abspath(outputFileName)}")
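
The wave module examples above only write files, but the real-time case that motivated this is nearly identical: hand the same interleaved frames to a PyAudio output stream instead of a wave object. Here’s a minimal Python 3 sketch of that (assuming PyAudio is installed and the default output device works):

import pyaudio
from struct import pack
from math import sin, pi

RATE = 44100
maxVol = 2**15 - 1

p = pyaudio.PyAudio()
stream = p.open(rate=RATE, channels=2, format=pyaudio.paInt16, output=True)

# one second of audio: 500 Hz on the left channel, 200 Hz on the right,
# interleaved sample-by-sample just like the stereo WAV example above
frames = b''
for i in range(RATE):
    left = int(maxVol * sin(2 * pi * 500 * i / RATE))
    right = int(maxVol * sin(2 * pi * 200 * i / RATE))
    frames += pack('<hh', left, right)

stream.write(frames)  # blocks until the buffer has been handed to the card
stream.stop_stream()
stream.close()
p.terminate()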




I’m sitting in class as frustrated as could be. The Internet in this room (D3-3 in the dental tower of Shands Hospital at UF) is unbelievably annoying. For some reason everything runs fine, then functionality drops to unusable levels. Downloading files (i.e., PDFs of lectures) occurs at about 0.5kb/s (wow), and Internet browsing is hopeless. At most, I can connect to IRC and enjoy myself in #electronics, #python, and #linux. I decided to channel my frustration into productivity and wrote a quick Python script to let me visualize the problem.

Notice the massive lag spikes around the times classes begin. I suspect it’s caused by Windows Update and anti-virus definition downloads hitting a gazillion Windows machines all at the same time as they connect to the network. Class start times were 8:30am, 9:35am, and 10:40am. Let’s view it on a logarithmic scale:

Finally, the code. It’s two scripts. One pings a website (kernel.org) every few seconds and records the ping time to “pings.txt”, and the other graphs the data. Here are the two scripts:

import socket, time, os, sys, re

def getping():
	pingaling = os.popen("ping -q -c2 kernel.org")
	sys.stdout.flush()
	while 1:
		line = pingaling.readline()
		if not line: break
		line=line.split("\n")
		for part in line:
			if "rtt" in part:
				part=part.split(" = ")[1]
				part=part.split('/')[1]
				print part+"ms"
				return part

def add2log(stuff):
	f=open("pings.txt",'a')
	f.write(stuff+",")
	f.close()

while 1:
	print "pinging...",
	stuff="[%s,%s]"%(time.time(),getping())
	print stuff
	add2log(stuff)
	time.sleep(1)

import pylab, time, datetime, numpy

def smoothTriangle(data,degree,dropVals=False):
	triangle=numpy.array(range(degree)+[degree]+range(degree)[::-1])+1
	smoothed=[]
	for i in range(degree,len(data)-degree*2):
		point=data[i:i+len(triangle)]*triangle
		smoothed.append(sum(point)/sum(triangle))
	if dropVals:
		print "smoothlen:",len(smoothed)
		return smoothed
	#smoothed=[smoothed[0]]*(degree+degree/2)+smoothed
	#while len(smoothed)<len(data):smoothed.append(smoothed[-1])
	while len(smoothed)<len(data):smoothed=[None]+smoothed+[None]
	if len(smoothed)>len(data):smoothed.pop(-1)
	return smoothed

print "reading"
f=open("pings.txt")
raw=eval("[%s]"%f.read())
f.close()

xs,ys,big=[],[],[]
for item in raw:
	t=datetime.datetime.fromtimestamp(item[0])
	maxping=20000
	if item[1]>maxping or item[1]==None:
		item[1]=maxping
		big.append(t)
	ys.append(float(item[1]))
	xs.append(t)

#print xs
#raw_input("WAIT")

print "plotting"
fig=pylab.figure(figsize=(10,7))
pylab.plot(xs,ys,'k.',alpha=.1)
pylab.plot(xs,ys,'k-',alpha=.1)
pylab.plot(xs,smoothTriangle(ys,15),'b-')
pylab.grid(alpha=.3)
pylab.axis([None,None,None,2000])
#pylab.semilogy()
#pylab.xlabel("time")
pylab.ylabel("latency (ping kernel.org, ms)")
pylab.title("D3-3 Network Responsiveness")
fig.autofmt_xdate()
#pylab.show()
pylab.savefig('out.png')
pylab.semilogy()
pylab.savefig('out2.png')
fig.autofmt_xdate()
print "done"




Warning: This post is several years old and the author has marked it as poor quality (compared to more recent posts). It has been left intact for historical reasons, but its content (and code) may be inaccurate or poorly written.

The VD Labs web page has been published! I hope that the new VD Labs page will be a single location where I can link to descriptions and downloads of useful radio, audio analysis, and QRSS-related software. It will eventually be the home of the next (recorded-from-scratch) version of QRSS VD, but let’s not get too far ahead of ourselves!

vd labs flyer

Since I ran out of steam from working so much on QRSS VD, I didn’t think I’d be publishing much more “useful” software, but this one blindsided me. People on the Knights QRSS mailing list were talking about dividing QRSS transmissions into images which line up with the period of the transmitter’s repeated message, then projecting the images together in an attempt to average out the noise and boost the signal. It’s a simple idea, and it’s the basis behind how a lot of poor imaging devices improve their output clarity in software (MRI, anyone?). I was overwhelmed by dental school obligations the last few weeks, and it pained me to read what people were doing (or at least trying to do) while having to sit it out. Now that I have a free day (yay for weekends!) I sat down and wrote some code. I introduce VD Labs QRSS Stitcher and QRSS Stacker!
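
The stacking step itself is nothing more than averaging time-aligned captures so the repeating signal reinforces while the random noise cancels. A minimal sketch of the idea (just an illustration, not the QRSS Stacker source; the frame filenames are hypothetical):

from PIL import Image
import numpy

# equally-sized, time-aligned captures (hypothetical filenames)
frames = ["frame0.png", "frame1.png", "frame2.png"]

total = None
for name in frames:
    img = numpy.asarray(Image.open(name).convert("L"), dtype=float)
    total = img if total is None else total + img

average = (total / len(frames)).astype(numpy.uint8)
Image.fromarray(average).save("stacked.png")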

Converting Argo captures into continuous images:

example output:

stitched

Doing the same thing, with ultra-narrow images:

File produced:

stacked_narrow

Using QRSS Stacker to project images:

Another example output:

stacked_stitched

Screenshots:

vd labs qrss stacker
vd labs qrss stitcher




Warning: This post is several years old and the author has marked it as poor quality (compared to more recent posts). It has been left intact for historical reasons, but its content (and code) may be inaccurate or poorly written.

My expression is completely flat right now. I simply cannot believe I’m about to say what I’m preparing to say. I spent nearly a year calculating large prime numbers. In short, I took on a project I called The Flowering N’th Prime Project, where I used my SheevaPlug to generate a list of every millionth prime number. The current “gold standard” is this page where one can look up the N’th prime up to 1 trillion. My goal was to reach beyond 1 trillion, which I did just this morning! I was planning on being the only source on the web to allow lookups of prime numbers greater than 1 trillion.

flowering_primes

However, when I went to look at the logs, I realized that the software had a small, fatal bug in it. Apparently every time the program restarted (which happened a few times over the months), although it resumed at its most recent prime number, it erased the previous entries. As a result, I have no logs below N=95 billion. In other words, although I reached my target this morning, it’s completely irrelevant since I don’t have all the previous data to prove it. I’m completely beside myself, and have no idea what I’m going to do. I can start from the beginning again, but that would take another YEAR. [sigh]

So here’s the screw-up. Apparently I coded everything correctly on paper, but due to my lack of experience I overlooked the potential for multiple appends to occur simultaneously. I can only assume that’s what screwed it up, but I cannot be confident. Honestly, I still don’t know specifically what the problem is. All in all, it looks good to me. Here is the relevant Python code.

def add2log(c,v):
 f=open(logfile,'a')
 f.write("%d,%d\n"%(c,v))
 f.close()

def resumeFromLog():
 f=open('log.txt')
 raw=f.readlines()[-1]
 f.close()
 return eval("["+raw+"]")
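
For what it’s worth, a more defensive logger would force every line to disk as it is written, so a restart or crash could lose at most the line in flight. A minimal sketch of that idea (not the code that actually ran on the SheevaPlug):

import os

def add2log_safe(logfile, c, v):
    with open(logfile, 'a') as f:
        f.write("%d,%d\n" % (c, v))
        f.flush()
        os.fsync(f.fileno())  # don't trust the OS to write it out later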

For what it’s worth, this is what remains of the log file:

953238,28546251136703
953239,28546282140203
953240,28546313129849
...
1000772,30020181524029
1000773,30020212566353
1000774,30020243594723




Warning: This post is several years old and the author has marked it as poor quality (compared to more recent posts). It has been left intact for historical reasons, but its content (and code) may be inaccurate or poorly written.

My goal is to create a QRPP (extremely low power) transmitter and modulation method to send QRSS (extremely slow, frequency shifting data) efficiently, able to be decoded visually or with automated image analysis software. This evolving post will document the thought process and development behind AJ4VD’s Frequency Shift Keying method, vdFSK.

Briefly, here is the idea. Rather than standard two-frequency QRSS3 (3 seconds per dot; low for space, high for tone), I eliminate the need for pauses between dots by using three frequencies (low for the space between letters, medium for dot, high for dash). The following images compare my call sign (AJ4VD) being sent with the old method and with the vdFSK method.

traditional
Traditional QRSS

Again, both of these images say the same thing: AJ4VD (.- .--- ....- ...- -..). However, note that the above image uses a dot longer than 3 seconds, so it’s unfairly long if you look at the time scale. Until I get a more representative image, just appreciate it graphically. It’s obviously faster to send three frequencies rather than two; in my case, it’s over 200% faster.

modulation
VD FSK idea

This is the code to generate audio files converting a string of text into vdFSK audio, saving the output as a WAV file. Spectrographs can be created from these WAV files.

### generate_audio.py ###
# converts a string into vdFSK audio saved as a WAV file

import numpy
import wave
from morse import *

def makeTone(freq,duration=1,samplerate=5000,shape=True):
    signal = numpy.arange(duration*samplerate)/float(samplerate)*float(freq)*3.14*2
    signal = numpy.sin(signal)*16384
    if shape==True: #soften edges
        for i in range(100):
            signal[i]=signal[i]*(i/100.0)
            signal[-i]=signal[-i]*(i/100.0)
    ssignal=''
    for i in range(len(signal)): #make it binary
        ssignal += wave.struct.pack('h',signal[i])
    return ssignal

def text2tone(msg,base=800,sep=5):
    audio=''
    mult=3 #secs per beep
    msg=" "+msg+" "
    for char in msg.lower():
        morse=lookup[char]
        print char, morse
        audio+=makeTone(base,mult)
        for step in lookup[char]:
            if step[0]==".":
                audio+=makeTone(base+sep,int(step[1])*mult)
            if step[0]=="-":
                audio+=makeTone(base+sep*2,int(step[1])*mult)
            if step[0]=="|":
                audio+=makeTone(base,3*mult)
    return audio

msg="aj4vd"
file=wave.open('test.wav', 'wb')
file.setparams((1, 2, 5000, 5000*4, 'NONE', 'noncompressed'))
file.writeframes(text2tone(msg))
file.close()

print 'file written'

And the other file needed…

### morse.py ###
# library for converting between text and Morse code
raw_lookup="""
a.- b-... c-.-. d-.. e. f..-. g--. h.... i.. j.--- k-.- l.-.. m--
n-. o--- p.--. q--.- r.-. s... t- u..- v...- w.-- x-..- y-.-- z--..
0----- 1.---- 2..--- 3...-- 4....- 5..... 6-.... 7--... 8---.. 9----.
..-.-.- =-...- :---... ,--..-- /-..-. --....-
""".replace("\n", " ").split()

lookup={}
lookup[" "]=["|1"]
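# each raw_lookup entry is a character followed by its Morse code (e.g. "a.-");
# the loop below converts the dot/dash string into tokens like ['.1','-1']
# (symbol plus run length), which is the format text2tone() walks through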
for char in raw_lookup:
    """This is a silly way to do it, but it works."""
    char,code=char[0],char[1:]
    code=code.replace("-----","x15 ")
    code=code.replace("----","x14 ")
    code=code.replace("---","x13 ")
    code=code.replace("--","x12 ")
    code=code.replace("-","x11 ")
    code=code.replace(".....","x05 ")
    code=code.replace("....","x04 ")
    code=code.replace("...","x03 ")
    code=code.replace("..","x02 ")
    code=code.replace(".","x01 ")
    code=code.replace("x0",'.')
    code=code.replace("x1",'-')
    code=code.split(" ")[:-1]
    #print char,code
    lookup[char]=code

produced

Automated decoding is trivial. The image above was analyzed, turned into the image below, and the string (AJ4VD) was extracted:

### decode.py ###
# given an image, it finds peaks and pulls data out
from PIL import Image
from PIL import ImageDraw
import pylab
import numpy

pixelSeek=10
pixelShift=15
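
# The decoder walks the spectrogram image column by column: findPeak() returns
# the brightest row (frequency bin) in each column, and peaks2morse() groups
# consecutive columns near the same frequency into runs whose length and offset
# from the base frequency are what distinguish spaces, dots, and dashes.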

def findPeak(data):
	maxVal=0
	maxX=0
	for x in range(len(data)):
		if data[x]>maxVal:
			maxVal,maxX=data[x],x
	return maxX

def peaks2morse(peaks):
	baseFreq=peaks[0]
	lastSignal=peaks[0]
	lastChange=0
	directions=[]
	for i in range(len(peaks)):
		if abs(peaks[i]-baseFreq)<pixelSeek:
			baseFreq=peaks[i]
		if abs(peaks[i]-lastSignal)<pixelSeek and i<len(peaks)-1:
			lastChange+=1
		else:
			if abs(baseFreq-lastSignal)<pixelSeek:c=" "
			directions.append([lastSignal,lastChange,baseFreq,baseFreq-lastSignal])
			lastChange=0
		lastSignal=peaks[i]
	return directions

def morse2image(directions):
	im=Image.new("L",(300,100),0)
	draw = ImageDraw.Draw(im)
	lastx=0
	for d in directions:
		print d
		draw.line((lastx,d[0],lastx+d[1],d[0]), width=5,fill=255)
		lastx=lastx+d[1]
	im.show()

im=Image.open('raw.png')
pix=im.load()
data=numpy.zeros(im.size)
for x in range(im.size[0]):
	for y in range(im.size[1]):
		data[x][y]=pix[x,y]

peaks=[]
for i in range(im.size[0]):
	peaks.append(findPeak(data[i]))

morse=peaks2morse(peaks)
morse2image(morse)
print morse