A few interested netizens have been in touch regarding the Ennio doorbell. The device isn’t an impenetrable box; quite the opposite. By the looks of things it’s built on top of OpenWRT, and here’s what we know so far.
Its core is a NixCore X1 (datasheet).
As per the datasheet, UART pins 39 and 40 provide a serial console; wiring them to a Raspberry Pi gets you onto the device, and flashing can be achieved via TFTP.
The camera is a UVC camera, which makes things fairly simple to get into.
What we need to achieve next is a full mapping of GPIO pins to input/output functions, e.g. GPIO14 = doorbell (just a guess).
I’m hoping to get some kind of build system up and running, and a github project to host it. I’ll even write up some custom software so you can free your doorbell from whatever security problems exist on the other side of it (the Amazon cloud).
What needs GPIO mapping:
- Touch button
- Blue LED for the button (is this controllable?)
- Light Sensor
- IR LEDs
- Relay port
What we’re uncertain of:
- Microphone input configuration
- Speaker output configuration
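One way to attack the GPIO mapping is to export candidate pins through the kernel’s legacy sysfs interface and watch for changes while poking the hardware (press the button, cover the light sensor, and so on). A rough sketch, assuming a sysfs-enabled kernel and with pin numbers that are pure guesses:

```python
import os
import time

GPIO_ROOT = "/sys/class/gpio"

def gpio_path(pin):
    return "%s/gpio%d" % (GPIO_ROOT, pin)

def export_pin(pin):
    """Make a GPIO visible in sysfs (no-op if already exported)."""
    if not os.path.isdir(gpio_path(pin)):
        with open(GPIO_ROOT + "/export", "w") as f:
            f.write(str(pin))

def read_pin(pin):
    with open(gpio_path(pin) + "/value") as f:
        return int(f.read().strip())

def watch(pin, seconds=30):
    """Poll a candidate pin and print transitions; press the button
    (or shade the light sensor) and see if this pin toggles."""
    export_pin(pin)
    last = None
    end = time.time() + seconds
    while time.time() < end:
        v = read_pin(pin)
        if v != last:
            print("GPIO%d -> %d" % (pin, v))
            last = v
        time.sleep(0.05)

# watch(14)  # GPIO14 = doorbell is only a guess
```

Run on the device itself (or over the serial console), calling `watch()` for each candidate pin in turn.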
LG WebOS TV UF850V
I recently upgraded my TV on account of the noises the old one was starting to make – it served me well, a good 7 years’ worth, but had to go. My new one has all those wifi and smart features you’d expect, so it makes sense to try and integrate it with Kodi and other things around the place using whatever tools are available… In my case, almost always, python. 😉
So after a little fiddling and reverse engineering, scanning, and interrogating I put together this tool. LGWebOSRemote – https://github.com/klattimer/LGWebOSRemote
Most of the existing tools for python users weren’t functional, and some were downright wacky. It seems newer versions of LG WebOS don’t use HTTP/REST; they favour websockets instead. Websockets are a pain in the backside, but regardless, the tool is here and it works (almost completely).
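For the curious, pairing happens over a websocket on port 3000 by sending a JSON "register" message; the TV pops up a prompt and hands back a client key to reuse later. The manifest below is a heavily trimmed sketch of the shape of that message, not the full payload a real client like LGWebOSRemote sends:

```python
import json

def register_payload(client_key=None):
    """Build the JSON 'register' message for a WebOS TV.

    The permission list here is illustrative; real clients send a
    much larger manifest.
    """
    payload = {
        "pairingType": "PROMPT",  # the TV shows an accept dialog
        "manifest": {
            "permissions": ["CONTROL_POWER", "CONTROL_AUDIO"],
        },
    }
    if client_key:
        # A key the TV handed back on a previous pairing
        payload["client-key"] = client_key
    return json.dumps({"type": "register", "payload": payload})

# The resulting string would be sent over ws://<tv-ip>:3000 (e.g. with
# the websocket-client library); subsequent commands reuse the socket.
```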
There’s an issue, which appears to be the TV’s fault, where the Wake-on-LAN feature times out over wifi; this shouldn’t happen if you’re wired in, though.
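Wake-on-LAN itself is simple enough to do by hand: a magic packet is six 0xFF bytes followed by the target’s MAC address repeated sixteen times, sent as a UDP broadcast. A minimal sketch (the MAC in the comment is a placeholder):

```python
import socket

def magic_packet(mac):
    """Build a WOL magic packet: 6 x 0xFF then the MAC 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", ""))
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac, broadcast="255.255.255.255", port=9):
    """Broadcast the magic packet on the LAN."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    s.sendto(magic_packet(mac), (broadcast, port))
    s.close()

# wake("aa:bb:cc:dd:ee:ff")  # substitute your TV's MAC address
```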
I’ve got a whole bunch of other things I’m working on, but nothing ready to release yet. Finding time is difficult, but hopefully some nice new code drops will come soon; otherwise, keep checking my github page for things I’m working on.
I’ve been pulling together the parts to build a VR headset for less than 100 quid. This isn’t as easy as you might think. To achieve it, I’m using a VR Box to provide the headset and lenses, which isn’t bad. For the screen and accelerometers I managed to pick up an HTC One M8 for £60, which was an absolute bargain. With this setup, the google goggles demos run very well. The 1080p screen on the HTC One M8 is perfect for the display; the grill between pixels is minimal compared to the Oculus DK2. It’s still not as good as an Oculus in terms of accelerometer lag and lens quality, but it is significantly cheaper.
I spent some time reading this Instructables guide, and now I’m all set to try out Kinoni with the headset.
To extend its abilities, I’m adding IR reflectors to the headset to allow head tracking with a camera. The tracking would have to be based on brightest spots, using an IR source and Scotchlite reflectors. I can only really get x/y head tracking working on a single standard desktop HD cam, because the distance between the reflective spots on the headset is too small to estimate depth. Adding a second camera would make it easier to calculate the headset’s relative 3D position in space. If you were going to add a second camera, would it make sense to use a Kinect instead?
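The brightest-spot idea can be sketched in pure NumPy, assuming a grayscale frame where the reflectors saturate against a dim background (a real tracker would use OpenCV and proper blob detection rather than this naive peak-picking):

```python
import numpy as np

def find_spots(frame, n=2, radius=5):
    """Find the n brightest spots in a grayscale frame.

    Repeatedly takes the global maximum and blanks a small square
    around it so the next maximum lands on a different spot.
    """
    img = frame.astype(float).copy()
    spots = []
    for _ in range(n):
        y, x = np.unravel_index(np.argmax(img), img.shape)
        spots.append((int(x), int(y)))
        img[max(0, y - radius):y + radius + 1,
            max(0, x - radius):x + radius + 1] = 0
    return spots

def head_xy(spots):
    """Head position as the midpoint of the two reflector spots."""
    (x1, y1), (x2, y2) = spots
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
```

With two cameras, the same spot pairs found in each view could be triangulated to recover depth as well.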
If you’ve read any of my previous posts on the topic, you’ll know I’m trying to get OOK, FSK and potentially PPM (differential pulse position modulation) working in the Linux kernel with the HopeRF RFM12B adapter. This is mostly for ARM SBC/SoC type situations like the Raspberry Pi or BeagleBone.
The intention here is to allow you to easily intercept, and transmit, consumer wireless signals on the 434/868 MHz bands. There are existing ways to do this; however, they all appear to depend on a JeeNode, or at least an ATmega chip running JeeLib.
I want to remove the dependency on the ATmega while still exploiting the RFM12B to provide OOK/FSK transmission. Right now I’m adapting the existing RFM12B-Linux module to allow sending and receiving OOK signals, and I’m also adding in the code to interact with specific defined devices. So far my thoughts are that it shouldn’t be too hard to have drivers for multiple devices in the module.
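To illustrate what OOK sending means at the bit level: the driver ultimately just keys the carrier on and off for defined durations. A toy pulse-width encoder (the timings below are made up; every 433 MHz device family defines its own):

```python
def ook_pulses(bits, short=500, long=1500):
    """Encode a bit string as (carrier_on_us, carrier_off_us) pairs.

    Simple pulse-width scheme: a 1 is a long on-pulse, a 0 a short
    one, each followed by a short gap. Purely illustrative; real
    remote sockets, weather stations etc. each have their own timings.
    """
    pulses = []
    for bit in bits:
        on = long if bit == "1" else short
        pulses.append((on, short))
    return pulses

# The kernel module would then toggle the RFM12B's transmitter for
# each duration in the list, e.g. with hrtimers for accurate timing.
```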
I thought it best to do a link dump of everything that is currently important to this endeavour:
After merging someone else’s OOK-sending efforts, I’m not too sure that listening for OOK and FSK is really important; after all, an SDR can happily listen across the frequencies and decode the signals.
I unfortunately get very little time to work on this, or other projects on this blog but try to keep it updated with my experiments from time to time. Once I have my Salus under control I plan to release the branch on github. Until then I get a little time here and there to experiment. My latest outcomes have been hampered by insufficient power to my Pi3. There have also been some issues with transmitting on the 868 band.
I managed to get around to playing with the Ennio wifi doorbell a little more, trying to figure out how all of it works. It seems I have to learn a few things about UDP; however, with a quick and dirty tcpdump on my OpenWRT router (which I was hacking in other ways earlier), writing to an NFS share on my RAID, I managed to collect a chunk of worthwhile data while my phone interacted with the camera.
As far as I can tell without capturing all of the data of all of the interactions it goes something like this:
- The phone sends a broadcast request of some kind, and the doorbell responds with a packet containing its name, sent to the UDP port specified by the initial contact.
- The phone logs into the device using the username and password provided, by sending a hex-encoded ASCII string with some preamble bytes:
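A sketch of what that exchange might look like from Python. The discovery port, probe bytes, preamble, and credential layout below are all placeholders; I haven’t decoded the real values from the capture yet:

```python
import socket

DISCOVERY_PORT = 32108  # placeholder; the real port comes from the capture

def discover(timeout=2.0):
    """Broadcast a probe and collect (address, raw reply) pairs."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    s.settimeout(timeout)
    # The actual probe bytes are unknown; a single null is a stand-in.
    s.sendto(b"\x00", ("255.255.255.255", DISCOVERY_PORT))
    replies = []
    try:
        while True:
            data, addr = s.recvfrom(4096)
            replies.append((addr, data))
    except socket.timeout:
        pass
    s.close()
    return replies

def login_packet(preamble, username, password):
    """Preamble bytes followed by hex-encoded ASCII credentials.

    The 'user:pass' layout is my guess at the structure, not a
    decoded fact.
    """
    creds = ("%s:%s" % (username, password)).encode("ascii")
    return preamble + creds.hex().encode("ascii")
```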
Here are the results of taking apart the Ennio wifi doorbell. I haven’t had a chance to dump the memory from this device yet, but I received a request to take some photos of the internals. Here are those photos:
The SoC is a RaLink RT5350, the data sheet for the chip can be found here. I guess some of the pins on the wide connector are for programming the memory chip (JTAG). I just need to work out which ones.
I’m looking at memory dump options. At first I wanted to mount my NFS server from the device and dump to a file there; however, that won’t work because NFS support is missing from the device. The other option is to dump it over telnet using base64 encoding, then decode it on the other side… less than ideal, but still possible. I need to boot the device up again to figure out what to do next.
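The decode side of that plan is straightforward. Assuming the device can run something like `base64 /dev/mtd0` (busybox builds vary, so that’s an assumption), the host just has to strip the shell noise out of the captured session and decode the rest:

```python
import base64

def decode_dump(captured_text, marker="-----"):
    """Rebuild a binary image from a base64 dump captured over telnet.

    Assumes the dump was wrapped between marker lines (echoed from the
    device before and after the base64 command) so shell prompts and
    command echoes can be discarded; the marker is my own convention.
    """
    inside = False
    chunks = []
    for line in captured_text.splitlines():
        if line.strip() == marker:
            inside = not inside
            continue
        if inside:
            chunks.append(line.strip())
    return base64.b64decode("".join(chunks))
```

Capture the telnet session to a file (e.g. with `script` or your client’s logging), then run it through `decode_dump` to get the raw flash image back.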
Setting up a cross compile toolchain to build new binaries might be the best option for getting what I need out of this thing. Although I doubt the source code for the IPCam will be available to me.
The most important thing for me to do is to disable the wifi hardware. Once disabled, it’s safe to actually put the hardware on the wall outside, although I will likely need to take it down at some point if I intend to flash the memory. I still wouldn’t recommend buying an Ennio wifi doorbell, or any of the variations out there: the failure point is the OS, which is a black box of obscurity over security. I would love to have the time to develop a better open-source OS to run on these things, one which would work on Foscam, Wansview and others too. Like OpenWRT, but for IPCams.
I recently purchased an Ennio Wifi Doorbell in order to have a doorbell, and also an outdoor front-of-house security camera. It seemed like pretty much the only option available.
The camera is fairly easy to set up in the way it was intended. Install an app from the app store, and pair it (with magic, or zeroconf/bonjour) with your phone.
When the button is pushed, a push notification arrives on your phone; I’m not quite sure how this happens yet, but I’ll dig further. The camera sends a photo with the push notification and also allows streaming of video from the camera.
As far as I can tell, both of these things work flawlessly over 3G as well as wifi, albeit through a horrifying app UI.
First things first, is it secure?
With this thing going on the front of my house I want to know if it’s possible to break into it or in some other way use it to break into my network.
The short answer: this device is about as secure as a wet paper bag with a block of gold in it. This, although a major downside from a consumer perspective, leaves many open opportunities for the hardware hacker.
So a new Arduino arrived today after the connectors gave out on my older ones. It’s a cheap Uno clone and will be followed by a selection of Nanos for various upcoming projects. I added support to my ESP8266 project for lightsOn, lightsOff, and setting a value for the lights, which can be found in the github repo. The server responds with a simple JSON string describing the current state of the light, which can be adjusted by sending particular HTTP requests.
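From the client side that looks something like this. The IP address, URL paths and JSON field names below are guesses at the shape of the interface; the real ones are defined in the repo:

```python
import json
import urllib.request

ESP_HOST = "http://192.168.1.50"  # placeholder address for the ESP8266

def call(endpoint):
    """Hit an endpoint like lightsOn/lightsOff and parse the JSON reply."""
    with urllib.request.urlopen("%s/%s" % (ESP_HOST, endpoint)) as resp:
        return parse_state(resp.read().decode())

def parse_state(raw):
    """Parse the simple JSON state string the server responds with."""
    state = json.loads(raw)
    return state.get("state"), state.get("value")

# e.g. call("lightsOn") might yield ("on", 255) -- field names guessed
```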
I’ve been busying myself with fixing up and adapting the RFM12B Linux driver. My first thought was simply to give people support for sending and listening for OOK signals as an extension, then to take the device support from the rtl_433 decoder and extend the RFM12B driver with lots of OOK device support for things like weather stations and energy monitors.
JeeLib already does most of this work on the Arduino, so for the most part this is simply a matter of joining lots and lots of code together from different places and making sure it sits right. I’ve decided that in order to do this it would probably be better to rewrite the driver, trying to fix some of the original driver’s TODO list along the way.
The driver will, loosely, allow:
- Compatibility with the original RFM12B driver and with JeeLib.
- Send OOK, FSK messages to devices.
- Listen for OOK, FSK messages from devices.
- Set tuning to a specific frequency.
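On the frequency-tuning point: per my reading of the RFM12B datasheet, the carrier is set by a 12-bit value F via f = 10 · C1 · (C2 + F/4000) MHz, where C1/C2 are per-band constants. A quick sketch of working backwards from a target frequency (worth double-checking against the datasheet before trusting it in a driver):

```python
# Band constants from the RFM12B datasheet: band MHz -> (C1, C2).
BANDS = {433: (1, 43), 868: (2, 43), 915: (3, 30)}

def freq_register(band, mhz):
    """12-bit F value for a target carrier frequency in MHz.

    f = 10 * C1 * (C2 + F/4000)  =>  F = (f / (10*C1) - C2) * 4000
    """
    c1, c2 = BANDS[band]
    f = int(round((mhz / (10.0 * c1) - c2) * 4000))
    if not 96 <= f <= 3903:  # the datasheet's stated limits for F
        raise ValueError("frequency out of range for this band")
    return f

# e.g. freq_register(868, 868.0) gives the F value the driver would
# write into the frequency-setting command for 868.0 MHz.
```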