Saturday, March 03, 2007

Send key presses to mythfrontend telnet control

Here is a simple Python script for controlling mythfrontend over a network. It captures key presses and forwards them to mythfrontend using the 'key ...' command of the telnet control interface.



#!/usr/bin/python
#
# Send control keys to mythfrontend. Remember to enable network control
# in the mythfrontend settings.
#
# Ben Anhalt 03/03/07
#

import curses
import curses.wrapper
import curses.ascii
import telnetlib
import sys



key_map = {ord('\\') : 'backslash',
           curses.KEY_BACKSPACE : 'backspace',
           ord('[') : 'bracketleft',
           ord(']') : 'bracketright',
           ord(':') : 'colon',
           curses.KEY_DOWN : 'down',
           ord('\n') : 'enter',
           ord('=') : 'equal',
           curses.ascii.ESC : 'escape',
           curses.KEY_F1 : 'f1',
           curses.KEY_F10 : 'f10',
           curses.KEY_F11 : 'f11',
           curses.KEY_F12 : 'f12',
           curses.KEY_F2 : 'f2',
           curses.KEY_F3 : 'f3',
           curses.KEY_F4 : 'f4',
           curses.KEY_F5 : 'f5',
           curses.KEY_F6 : 'f6',
           curses.KEY_F7 : 'f7',
           curses.KEY_F8 : 'f8',
           curses.KEY_F9 : 'f9',
           ord('>') : 'greater',
           curses.KEY_LEFT : 'left',
           ord('<') : 'less',
           curses.KEY_NPAGE : 'pagedown',
           curses.KEY_PPAGE : 'pageup',
           curses.KEY_RIGHT : 'right',
           ord(';') : 'semicolon',
           ord('/') : 'slash',
           ord(' ') : 'space',
           curses.KEY_UP : 'up'}

def main(stdscr, tn):
    stdscr.scrollok(1)

    while 1:
        c = stdscr.getch()

        if c == ord('q'):
            break

        try:
            # Special keys get their mythfrontend names from the table.
            send = 'key ' + key_map[c] + '\n'
        except KeyError:
            # Plain letters and digits are sent as themselves.
            if curses.ascii.isalnum(c):
                send = 'key ' + chr(c) + '\n'
            else:
                send = '\n'

        tn.write(send)
        stdscr.addstr('Sent: ' + send)

        #recv = tn.read_some()
        #stdscr.addstr('Recv: ' + recv)


try:
    host = sys.argv[1]
except IndexError:
    print """Usage: %s HOST [PORT]
HOST - Address of host running mythfrontend.
PORT - Mythfrontend control port.""" % sys.argv[0]
    sys.exit()

port = 6546
if len(sys.argv) > 2:
    port = int(sys.argv[2])


tn = telnetlib.Telnet(host, port)
tn.read_until('#')  # wait for the network control prompt

curses.wrapper(main, tn)

tn.close()
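If you just want to fire off a single command without the curses interface, the same plain-text protocol can be exercised in a few lines. This is only a sketch: it assumes the default port 6546 and that network control is enabled, and it uses a raw socket rather than telnetlib.

```python
import socket

def key_command(key):
    # The network control protocol is plain text: 'key <name>\n'.
    return 'key %s\n' % key

def send_key(host, key, port=6546):
    # One-shot sender. Assumes network control is enabled in the
    # mythfrontend settings.
    s = socket.create_connection((host, port))
    try:
        s.recv(1024)  # swallow the '#' prompt
        s.sendall(key_command(key).encode('ascii'))
    finally:
        s.close()
```

For example, `send_key('mythbox', 'escape')` would back out of whatever screen the frontend is on (the hostname here is made up).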

Monday, August 07, 2006

Brakes

I had to work on the brakes on Carissa's car this week. I replaced the brake shoes and wheel cylinders on both sides and a little piece of brake line on one side. After I finished I took a shower and my hands pruned-up unbelievably. I think the brake fluid dissolves the water resistant oils on the skin allowing the cells to suck up water super fast. No telling what else the absorbed brake fluid does to your body. I'm sure my liver loves me.

Wednesday, May 10, 2006

Dishwasher

Had to install a new dishwasher this past week. The old one had started to leak. I had hoped that it was just leaking around where the drain hooked up, but the seal around the pump shaft had gone. We replaced it with a Kenmore Elite that was on close-out at Sears. It seems pretty good so far. The installation was fairly straightforward. I had to cut a new hole in the floor to bring the water supply up in the right place. Also needed a new hole in the side of the cabinet for the drain hose. The installation instructions called for teflon tape on the threads where the water supply hooks up to the inlet valve. I put about four rounds of tape on and snugged it down pretty tight. After turning on the water there was a slow seep, though. So, I switched to some thread compound and that seemed to do the trick. Is there some trick to the teflon tape I don't know about? I have never had much luck with it.

Tuesday, May 09, 2006

Mushroom Tomato Soup

Here is a recipe for spicing-up standard condensed tomato soup:

  • 1 Tbsp butter
  • 1/8 cup chopped onion
  • 1 clove chopped garlic
  • 1 1/2 cups sliced mushrooms (about 3 oz by weight)
  • 3/4 tsp basil
  • 1 tsp oregano
  • 1 cup milk
  • 1 can condensed tomato soup (10 3/4 oz)
  • 1/2 tsp black pepper
  • 1/2 cup chopped green bell pepper (about 1/4 of a whole pepper)

Saute the onions and garlic in the butter in a medium saucepan over medium heat until the onions begin to turn translucent. Add the mushrooms and saute for 3-4 more minutes. Add the basil and oregano and saute one more minute. Add the milk and condensed soup and bring to a low boil. Add the black pepper and chopped green bell pepper. Simmer for 10-20 minutes.

This recipe is a good way to use up leftover mushrooms and green peppers since the amounts are not critical.

Monday, May 08, 2006

The not uncommon failings of Wikipedia

In my last post, nearly a month ago now, I began discussing an article by Jason Scott called The Great Failure of Wikipedia. While I agree with most of his specific criticisms, the overall analysis seems a little off. I certainly don't believe that Wikipedia can be called a failure. I think the value of Jason's article is that it illustrates that we don't really know what the hell Wikipedia is for or even what it is.

The main troubles Jason seems to point to are:

  1. Wikipedia is not a good place for content creators.
  2. Experts are not respected on Wikipedia.
  3. The Wikipedia process is inefficient.


For example he writes:


This is what the inherent failure of wikipedia is. It's that there's a small set of content generators, a massive amount of wonks and twiddlers, and then a heaping amount of procedural whackjobs. And the mass of twiddlers and procedural whackjobs means that the content generators stop being so and have to become content defenders. Woe be that your take on things is off from the majority. Even if you can prove something, you're now in the situation that anybody can change it. And while that's all great in a happy-go-lucky flower shower sort of way, it's when you realize that the people who are going to change it could have absolutely no experience with the subject whatsoever, then you see where we are.


I think that the true value of Wikipedia, what it is for, is to expose and democratize the process of adding new information into a sort of aggregate knowledge of society. Notice, I don't say "sum of human knowledge". The sum of human knowledge contains not only the gems but also every religious dogma, every crack-pot theory, every piece of hussied-up marketing bullshit known to man. How am I, Joe Sixpack, to know that global climate change is real, but magnet therapy is bogus? I can find experts to support each, I'm sure. Nevertheless, there is a sort of progress to the state of aggregate knowledge; it is now widely accepted that the earth is indeed round. Wikipedia is the online embodiment of the process by which this glomming-on of esoteric bits of information to the set of aggregate knowledge occurs.

Let's analyze such a process and see what properties it would possess. First, there are good models of such a process. Jason describes Wikipedia as having "all the politics and turf war of Ivory Tower Academia without the mitigating barrier of time to cool down or consider." I would definitely agree that there are turf wars and politics in the world of academia, but I'm not so certain that the time between committee meetings is used to cool down and consider so much as to stew and plot. "The main difference between [Wikipedia] and other similar academic environments is the pure speed at which stuff can happen...," writes Jason. I think this is true. I also think that, eventually, the academic process turns out a product that is better than what it started with. It is an excruciating process; it grinds people up. I have seen it myself. But, in the end, this kind of process is the best we have come up with for getting at knowledge. If Wikipedia represents a speeding-up of this process, that should be a good thing in terms of the amount of information that can be added to the set of aggregate knowledge. It is not so good in terms of grinding people up, but the fact that the Wikipedia process is fueled by amateurs means that people can truly take breaks, unlike real academics who have to settle for sabbaticals spent working in industry.

This gets straight to the question of inefficiency. The amount of apparent waste in any knowledge gathering process always seems staggering. This seems to be true of any deliberative process. Look at Congress or any parliament. I think our notions about knowledge are what lead us to be surprised by this state of affairs. We say things like "the sum of human knowledge," and imagine some great library. In reality, human knowledge is an ecosystem, and its advancement is a process of evolution. It is typified by waste and redundancy. Ideas grow at the expense of others, only to be consumed themselves. We have a beautiful canopy of "great truths" under which grow small ideas and tentative theories which fight amongst one another. Look here: quantum gravity just gobbled up one of the weak offspring of string theory. At the very bottom there is a layer of rotting misconceptions and wild hypotheses, out of which will grow future great ideas.

If Wikipedia is the jungle, the content creator is like a farmer. You do not plant your seeds in the middle of the teeming forest. You find a clearing where you can nurture your idea as it grows. If it becomes strong enough, perhaps it can make it on its own. Likewise, if you are doing original research, Wikipedia is not going to be a good environment. As Jason notes, "[There] you are a content defender and that means that time you could be spending finding new and interesting facts or finding original sources or otherwise making the world a better place (or at least an entry or two) is being spent explaining for the hundredth time that no, this really happened and yes, I got clearance for that photograph, and yes, I believe this shows a neutral point of view, and so on."

Finally, we come to the question of experts. There are a lot of things I would like to say here. I am the sort of person who severely distrusts authority. But for now, I will just point out that from the vantage point of trying to put together a collection of established information, the concept of experts is essentially useless. You can find experts to back any statement. Here, people will say one of two things: either check up on your expert in some way, or look for experts who are respected by their community. But these responses simply beg the question. In the first case, I am forced to become an expert myself, obviating the need for the expert at all. In the second case, I still face the same question; only now it is which group of experts I should credit, since for any question that has not already been settled, there will always be multiple schools of thought.

I guess I could sum up my position by saying that Wikipedia is the Worse is Better approach to online encyclopedia building, while recalling the adage: don't let the perfect become the enemy of the good.

Thursday, April 13, 2006

Jason Scott's "The Great Failure of Wikipedia"

I just read Jason Scott's The Great Failure of Wikipedia. There is also audio from a talk he gave on this subject which I have downloaded but not yet listened to.

I am deeply fascinated by the Wikipedia phenomenon, and Jason Scott's article is very interesting. My initial reaction is general disagreement with his conclusions. I want to think about it a bit more and listen to the audio before writing too much. But, I also want to outline a couple ideas before I forget them.


  • The consumer vs. content creator dichotomy.

  • The wasted effort argument.

  • Is Wikipedia a place for "content" creation or a place for "content" digestion?

  • The question of authority.



These are a few issues I would like to analyze at some point. This is mostly a reminder to myself. But if anyone ever bothers to read this, you'll know what to expect in the coming days.

Friday, April 07, 2006

Ettercap

I spent some time today playing around with Ettercap, the ARP poisoning tool. (On our own network, of course.) Sometimes it is fun to watch network traffic with e.g. Ethereal, but on a switched network you can only see the traffic coming to and going from the machine doing the sniffing. Ettercap allows you to spoof the ARP entries on other machines on the network so that their traffic is redirected through your sniffer. Today, I was using tulip (running Debian) as the sniffer and targeted harpsichord (running XP) to see if it would work. I was easily able to capture http traffic to and from harpsichord. Happily, I didn't see any passwords go by in the clear. Of course, just a cursory examination proves little. I was a little confused when I opened a samba share that resides on daisy without seeing any traffic being logged on the sniffer. How were those samba packets getting around the ARP poisoning? I'm embarrassed it took me a while to realize that I had only poisoned harpsichord and the gateway. Since daisy and harpsichord are on the same subnet, they communicate without the aid of the gateway. Once I poisoned daisy as well, the smb traffic was revealed.
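The same-subnet point is easy to check mechanically. Here is a quick sketch using Python's ipaddress module; the addresses are made up for illustration, since the real ones depend on your network:

```python
import ipaddress

def same_subnet(ip_a, ip_b, prefix_len):
    # Hosts on the same subnet exchange frames directly, bypassing the
    # gateway -- so poisoning only the gateway misses their traffic.
    net = ipaddress.ip_network('%s/%d' % (ip_a, prefix_len), strict=False)
    return ipaddress.ip_address(ip_b) in net

# With made-up addresses: if harpsichord and daisy both sit in
# 192.168.1.0/24, their smb traffic never touches the gateway.
print(same_subnet('192.168.1.10', '192.168.1.20', 24))
```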

I have in the past noticed the number of unsolicited packets coming off the Internet that get dropped by our firewall. Mostly, they are aimed at the windows file sharing ports. Supposedly these are due to worms attempting to infect other systems. Someday, I might try exposing an un-updated XP machine through the router's DMZ to see how long it takes to get infected. That's where Ettercap comes in. Right now, I don't have a computer that I can use as a guinea pig, though.

In other news, I am now using a shared apt cache to update my Debian/Testing systems. I just added a noauto entry to /etc/fstab that I can mount over /var/cache/apt/archives before running apt-get upgrade.
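For the record, the fstab entry is along these lines; the server name and export path here are hypothetical:

```
# /etc/fstab -- shared apt cache, mounted by hand before upgrading
daisy:/var/cache/apt/archives  /var/cache/apt/archives  nfs  noauto  0  0
```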

I finally fixed the power connector on the Sony laptop. It has a nonstandard barrel-type connector. The ring contact consists of a little copper tab on the inside of the back of the jack. The rest of the jack is plastic, except for the tip contact. Over time, the ring contact had weakened and moved away from the inside of the jack until it could no longer make a good connection to the plug. I took the laptop apart, which is a pain in the ass, and unsoldered the power jack. To get the ring contact to extend further into the jack, I wedged a little bit of copper wire between the contact and the housing. The plug now fits much more snugly, and the connection seems good. Unfortunately, at one point I walked away to get a tool, and the damn soldering iron moved over and melted a hole in the plastic of the laptop's case. Luckily, it only did cosmetic damage. Not one of my best repair jobs, though.