Raspberry Pi and HID Omnikey 5321 CLI USB

I recently came across a project where I needed to interact with some RFID tags. I wanted to retrieve the unique ID of each badge. I had absolutely no information on the badges except the string “HID iClass” written on them.

I started doing some research and found out that there are two main frequencies used in RFID: 125 kHz and 13.56 MHz. The iClass family seems mainly based on the 13.56 MHz standard so I decided to go for a reader on this frequency.

Then I found out that there are several standards on this frequency. The most used are (in order) ISO 14443A, ISO 14443B, and ISO 15693. Nevertheless the iClass type includes several tag variations covering all these standards. Finally I decided to buy the Adafruit reader which handles both ISO 14443A and B: https://www.adafruit.com/products/364

I set it up with a Raspberry Pi 2 and was able to read the tag shipped with the reader, but sadly not the tag I wanted to read… Since I was unable to read my tag, I guessed they are using the third protocol: ISO 15693.

I looked for a reader for ISO 15693 but the choice is very limited (since it is not widely used). In the meantime I found a cheap HID reader on Amazon (https://www.hidglobal.fr/products/readers/omnikey/5321-cli) which should be compatible with HID iClass cards, so I decided to buy it.

It works pretty well on Windows with their driver and software and gives me some useful information about my badge. It allowed me to confirm that it uses the ISO 15693 standard.

It’s a good start; nevertheless I wanted to use it on a Raspberry Pi. I did some more research and found out that this type of RFID card reader is called “PC/SC”:

PC/SC (short for “Personal Computer/Smart Card”) is a specification for smart-card integration into computing environments. (wikipedia)

Moreover there is a USB standard for such devices: CCID.

CCID (chip card interface device) protocol is a USB protocol that allows a smartcard to be connected to a computer via a card reader using a standard USB interface (wikipedia)

Most USB-based readers comply with a common USB CCID specification and therefore rely on the same driver (libccid under Linux), part of the MUSCLE project: https://pcsclite.alioth.debian.org/

There is plenty of software related to RFID reading on Linux that I found during my research before choosing to try CCID. Here are my raw notes for future reference:

  • PCSC lite project
  • PCSC-tools
  • librfid
    • Seems dead
    • https://github.com/dpavlin/librfid
    • low-level RFID access library
    • “This library intends to provide a reader and (as much as possible) PICC / tag independent API for RFID applications”
  • pcscd
  • libnfc
    • https://github.com/nfc-tools/libnfc
    • forum is dead
    • libnfc is the first libre low level NFC SDK and Programmers API
    • Platform independent Near Field Communication (NFC) library http://nfc-tools.org
    • Whether libnfc depends on libccid seems to depend on the hardware reader used. Note from their docs: “If you want all libnfc hardware drivers, you will need to have libusb (library and headers) plus, on *BSD and GNU/Linux systems, libpcsclite (library and headers), because some dependencies (e.g. libusb and optional PCSC-Lite) are used.”
  • OpenSC

I decided to go with the MUSCLE project available here: https://pcsclite.alioth.debian.org/ccid.html
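For reference, once pcscd recognises a reader, reading a card UID usually only takes a few lines with the pyscard Python binding. This is a sketch only, using the common FF CA 00 00 00 GET DATA APDU for contactless readers; as explained below I never got that far with this reader:

from smartcard.System import readers
from smartcard.util import toHexString

# List the PC/SC readers seen by pcscd
aReaders = readers()
print("Readers found:", aReaders)

# Connect to the first reader and ask the card for its UID
# (FF CA 00 00 00 is the usual GET DATA APDU on contactless readers)
aConnection = aReaders[0].createConnection()
aConnection.connect()
aData, aSw1, aSw2 = aConnection.transmit([0xFF, 0xCA, 0x00, 0x00, 0x00])
print("UID:", toHexString(aData), "status: %02X %02X" % (aSw1, aSw2))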

After I installed the driver/daemon and the tools to interact with the reader, I had trouble since the reader was not detected by pcscd. Luckily there is a section “Check reader’s compliance to CCID specification” on the pcsc page to know if the reader is supported. I followed it and sent the report to the main maintainer of the pcsc driver: Ludovic Rousseau.

He confirmed that the driver was never tested with this reader and gave me instructions to try it:

Edit the file CCID/readers/supported_readers.txt and add the line:
0x076B:0x532A:5321 CLi USB
Then (re)install the CCID reader and try again to use the reader.
https://ludovicrousseau.blogspot.fr/2014/03/level-1-smart-card-support-on-gnulinux.html

I followed the instructions and the reader got detected by the daemon. Nevertheless the card was not detected, so I provided more feedback/logs to Ludovic for debugging, and sadly the conclusion is that the reader cannot be supported:

The conclusion is that this reader is not CCID compliant. I am not surprised by this result.
You have to use the proprietary driver and no driver is provided for RaspberryPi.
If you are looking for a contactless reader have a look at https://pcsclite.alioth.debian.org/select_readers/?features=contactless

I will try to see if I can interact with the reader through libusb, and also look for a cheap open-source ISO 15693 reader to continue this project.

Update 23JAN2017

I contacted Omnikey support about using their reader for my project and they confirmed there is no driver for it on the Pi:

we don’t have any drivers for 5321 CLi on Raspberry Pi. Please have a look at OMNIKEY 5022 or OMNIKEY 5427 CK instead. They can be accessed through the native ccidlib.

In the meantime I also bought another reader compatible with the ISO 15693 standard: http://www.solutions-cubed.com/bm019/

I plugged it into an Arduino Uno thanks to their blog article: http://blog.solutions-cubed.com/near-field-communication-nfc-with-the-arduino/

Nevertheless I was still unable to read the tags. I started doing deeper research and found that ISO 15693 can have several settings, and I did not know which ones my iClass tags are using. I tried all the possible combinations that the BM019 handles.

Even with all the tests I made I’m still unable to read them. I dug deeper and found out that the BM019 module is built around the ST CR95HF chip. It seems that I’m not the only one trying to read iClass with this IC, and their support forum has several posts explaining that it is not possible since iClass does not properly follow the ISO 15693 standard:

The issue comes from Picopass, which is not ISO 15693 compliant: the timings are not respected.
We have already implemented a tricky command which allows us to support Picopass; a new version of the CR95HF development software will soon be available, including a dedicated window for PICOPASS.

After three readers and countless hours of attempts, I’m still unable to read the iClass badges since they do not seem to implement any real standard.

Electric Train V3

Feel free to have a look at the V2 first:
http://djynet.net/?p=759

The biggest change is the camera, which is now able to follow the phone’s orientation to update its angle. I also replaced the front/rear sensors.

TrainV3

Camera tracking

I used the HTML5 DeviceOrientation API to detect the phone orientation:

if (window.DeviceOrientationEvent) {
  // Our browser supports DeviceOrientation
  window.addEventListener("deviceorientation", deviceOrientationListener);
} else {
  console.log("Sorry, your browser doesn't support Device Orientation");
}
function deviceOrientationListener(event) {
  var c = document.getElementById("myCanvas");
  var ctx = c.getContext("2d");

  ctx.clearRect(0, 0, c.width, c.height);
  ctx.fillStyle = "#FF7777";
  ctx.font = "14px Verdana";
  ctx.fillText("Alpha: " + Math.round(event.alpha), 10, 20);
  ctx.beginPath();
  ctx.moveTo(180, 75);
  ctx.lineTo(210, 75);
  ctx.arc(180, 75, 60, 0, event.alpha * Math.PI / 180);
  ctx.fill();

  ctx.fillStyle = "#FF6600";
  ctx.fillText("Beta: " + Math.round(event.beta), 10, 140);
  ctx.beginPath();
  ctx.fillRect(180, 150, event.beta, 90);

  ctx.fillStyle = "#FF0000";
  ctx.fillText("Gamma: " + Math.round(event.gamma), 10, 270);
  ctx.beginPath();
  ctx.fillRect(90, 340, 180, event.gamma);
  
  var aMsg = event.alpha.toString()+"_"+event.beta.toString()+"_"+event.gamma.toString();
  console.log("aMsg" + aMsg);
  doSend(aMsg);
}

This sends the three orientation values to the Tornado Python server running on the train’s Raspberry Pi. At first I was making JSON REST calls to send the string containing the information, but it was too slow to move the camera in real time. This was the perfect opportunity to use WebSockets for more real-time communication.


function onOpen(evt) {
    console.log("CONNECTED");
    doSend("Hi there!");
}
function onClose(evt) {
    console.log("DISCONNECTED");
}
function onMessage(evt) {
    console.log('message: ' + evt.data);
}
function onError(evt) {
    console.log('error: ' + evt.data);
}
function doSend(message) {
    websocket.send(message);
}
function testWebSocket() {
    websocket.onopen = function(evt) { onOpen(evt) };
    websocket.onclose = function(evt) { onClose(evt) };
    websocket.onmessage = function(evt) { onMessage(evt) };
    websocket.onerror = function(evt) { onError(evt) };
}

if (!('WebSocket' in window)) {
    console.log("Sorry, your browser doesn't support WebSockets");
} else {
    var wsUri = "ws://192.168.10.1:80/ws";
    var websocket = new WebSocket(wsUri);
    testWebSocket();
}

This is received on the server side and stored in a variable (see the class Handler_WS):

    def on_message(self, iMessage):
        """Methode call when the server receive a message"""
        logging.info('Receive incoming message:'+str(iMessage))
        #self.write_message("toto")
        self.aTrainRef._cellAngles=str(iMessage)
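For context, the on_message method above lives in a Tornado WebSocketHandler. Here is a minimal sketch of how such a Handler_WS class could be wired into the application; the Train placeholder and the /ws route are assumptions based on the snippets in this post, not the project's actual code:

import logging
import tornado.ioloop
import tornado.web
import tornado.websocket

class Train(object):
    """Placeholder for the real train object holding the orientation string"""
    def __init__(self):
        self._cellAngles = ""

class Handler_WS(tornado.websocket.WebSocketHandler):
    def initialize(self, iTrainRef):
        # Keep a reference to the train object so incoming messages can update it
        self.aTrainRef = iTrainRef

    def on_message(self, iMessage):
        """Method called when the server receives a message"""
        logging.info('Receive incoming message:' + str(iMessage))
        self.aTrainRef._cellAngles = str(iMessage)

aTrain = Train()
# Route assumed to match the "ws://192.168.10.1:80/ws" URL used by the client above
application = tornado.web.Application([(r"/ws", Handler_WS, dict(iTrainRef=aTrain))])
application.listen(80)
tornado.ioloop.IOLoop.current().start()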

This variable is then read every 125 ms by the “foo” function:

tornado.ioloop.PeriodicCallback(lambda: foo(aTrain), 125).start()
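The “foo” function itself is not shown in the post; presumably it only forwards the call to the train object, something along these lines (hypothetical body matching the snippets above):

def foo(iTrain):
    # Periodic task: push the latest phone angles to the turret servos
    iTrain.updateTurretFromScreenAngle()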

In the end, the method actually called is in charge of updating the turret position. The whole thing is based on an existing framework called ServoBlaster, which takes care of driving the servos.

def updateTurretFromScreenAngle(self):
        if (self._cellAngles!=""):
            #Update Gamma
            aGamma = self._cellAngles.split("_")[2]
            aGammaF = float(aGamma)
            aGammaI = int(aGammaF)
            aGammaisNegative = False
            if (aGammaI<0): aGammaI=(aGammaI*-1)-40 aGammaisNegative = True else: aGammaI=140-aGammaI if ((aGammaI>0)and(aGammaI<100)):
                self._turretHeight = 100 - aGammaI
                self.sendPos(ConstModule.CONST_SERVO_HEIGHT,self._turretHeight)
            #Update Alpha
            ...

ServoBlaster is a library able to drive servos on the Raspberry Pi using software PWM. This is pretty hard to do since the Pi is not running a real-time OS. It relies on very low-level interrupts to ensure the timings needed for a proper PWM are respected. You can find more info on it here:

https://github.com/richardghirst/PiBits/tree/master/ServoBlaster

It basically starts a daemon (which I added to the crontab so it is launched at boot time) which you interact with by writing the desired position of each servo to /dev/servoblaster, like:

echo 3=120 > /dev/servoblaster 

I also used ServoBlaster to send PWM info to the motor driver to change the train speed (since this functionality was broken when I moved from the Arduino to the Raspberry Pi).
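From Python this boils down to writing the same kind of string to the device file. Here is a minimal sketch of what a sendPos helper like the one used in updateTurretFromScreenAngle might look like; the real implementation in the project may differ:

def sendPos(iServoId, iPosition):
    # ServoBlaster is driven by writing "<servo>=<position>" lines to its device file,
    # exactly like the echo command above
    with open("/dev/servoblaster", "w") as aDevice:
        aDevice.write("{0}={1}\n".format(iServoId, iPosition))

# Same effect as "echo 3=120 > /dev/servoblaster"
sendPos(3, 120)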

Contact sensors

I replaced the old contact sensors with new sensors able to detect an incoming obstacle before impact.

TrainSensorNew

They are still binary sensors that go high when they detect an obstacle, but they have a wider range, between 2 and 10 centimeters. This allows the train to detect an incoming obstacle and stop before hitting it. The sensor is available on Adafruit:
https://www.adafruit.com/products/1927
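Reading such a binary sensor from the Pi is straightforward. A minimal polling sketch with RPi.GPIO, assuming the sensor output is wired to GPIO 17 and goes high on detection as described above:

import time
import RPi.GPIO as GPIO

SENSOR_PIN = 17  # hypothetical pin, adjust to the actual wiring

GPIO.setmode(GPIO.BCM)
GPIO.setup(SENSOR_PIN, GPIO.IN)

try:
    while True:
        # Sensor output is high while an obstacle is within range
        if GPIO.input(SENSOR_PIN):
            print("Obstacle detected, stop the train")
        time.sleep(0.05)
finally:
    GPIO.cleanup()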

Demo

I made some videos of this new version on YouTube:

Code

As always the code is available here:
https://bitbucket.org/charly37/train/overview

Electric train V2

Following the first version of the electric train: http://djynet.net/?p=731

After some weeks of work I’m proud to announce Version 2 of the electric train:

TrainV2

Wifi capabilities

The train can now be controlled over Wi-Fi. It creates a Wi-Fi hotspot at boot time, allowing people to connect to it and access a UI with some commands. The Wi-Fi hotspot creation is described HERE.

Web

The train now offers a Web UI which allows controlling it and seeing the camera broadcast. The UI is done in AngularJS (with Angular UI Bootstrap). The Web server used to render the page is a Python one: Tornado.

In addition to the UI it offers a REST API to control the train (currently called by the UI, but it could be used by a native Android application). The Web setup is detailed HERE (TODO).

Embedded camera

The train is now equipped with a camera (the official Raspberry Pi camera). The camera stream is broadcast and available on the train’s Web UI. The camera broadcast setup is described HERE.

Raspberry Pi brain

I replaced the Arduino board with a Raspberry Pi A+. This extra boost of power was needed to broadcast the camera stream and create a Wi-Fi hotspot.

UBEC Power source

The biggest surprise I had when creating the new version was a lot of unexpected Raspberry Pi reboots. Every time the train started to move, the Raspberry Pi was rebooting. I quickly suspected it was due to the motor, which either drew too much current or created perturbations that the 7805 could not handle by itself. I did some research to understand how this issue is usually handled in the R/C world and found out that they already have the perfect solution: a BEC.

It is used to power the command part of an R/C model from the same source as the motor. It provides a smooth voltage and is able to absorb the impact of the motors on the power source with the use of inductors and capacitors (Wikipedia link). Since it is a standard component in the R/C world you can buy one pretty easily on the Web:

UBEC

The final result is visible in this video: TODO

Raspberry Pi Wifi hotspot

I need my Raspberry Pi to create its own private dedicated Wi-Fi network so that people can connect to it and access some services it provides (like the camera broadcast).

To do so I did some research and found several tutorials (see the links at the end of the post). This post is just a summary of what worked in my case (in case I need to redo it). I strongly suggest checking the links at the end of the article.

The solution relies on two pieces of software:

  • hostapd
    HostAPD is a user-space daemon for access points and authentication servers. That means it will turn your Raspberry Pi into an access point that other computers can connect to. It will also handle security so that you can set up a Wi-Fi password.
  • isc-dhcp-server
    isc-dhcp-server is the Internet Systems Consortium’s implementation of a DHCP server. A DHCP server is responsible for assigning addresses to computers and devices connecting to the Wi-Fi access point.
    Some people use udhcpd, which is a lighter version.

The first thing to do is to install the software:

sudo apt-get update
sudo apt-get install isc-dhcp-server hostapd

DHCP configuration

Then configure the DHCP server with two files:

  • /etc/default/isc-dhcp-server
    This will make the DHCP server hand out network addresses on the wireless interface:
    change INTERFACES="" to INTERFACES="wlan0"
  • /etc/dhcp/dhcpd.conf

Comment out the following lines:

option domain-name "example.org";
option domain-name-servers ns1.example.org, ns2.example.org;

Make the DHCP server the authoritative one on the network by removing the comment on the line:

#authoritative;

Define the network and DHCP config by adding the following block (this configuration will use the Google DNS servers at 8.8.8.8 and 8.8.4.4):

subnet 192.168.10.0 netmask 255.255.255.0 {
range 192.168.10.10 192.168.10.20;
option broadcast-address 192.168.10.255;
option routers 192.168.10.1;
default-lease-time 600;
max-lease-time 7200;
option domain-name "local-network";
option domain-name-servers 8.8.8.8, 8.8.4.4;
}

Network interface

Now that the DHCP server is configured, we will set up the network card (Wi-Fi dongle in our case) with a static IP.

Edit the file:

/etc/network/interfaces

Remove everything related to “wlan0” and paste:

allow-hotplug wlan0

iface wlan0 inet static
address 192.168.10.1
netmask 255.255.255.0

And now the last configuration step: the hostapd server.

HostApd

Create a new file:

/etc/hostapd/hostapd.conf

and paste:

interface=wlan0
driver=nl80211
#driver=rtl871xdrv
ssid=MyPi
hw_mode=g
channel=6
macaddr_acl=0
auth_algs=1
ignore_broadcast_ssid=0
wpa=2
wpa_passphrase=raspberry
wpa_key_mgmt=WPA-PSK
wpa_pairwise=TKIP
rsn_pairwise=CCMP

Be aware of the driver choice: nl80211.

Then the last file to configure:

/etc/default/hostapd

Find the line #DAEMON_CONF="" and edit it so it says DAEMON_CONF="/etc/hostapd/hostapd.conf"

Don’t forget to remove the # in front to activate it!

Then you can test with :

sudo /usr/sbin/hostapd /etc/hostapd/hostapd.conf

If it does not work and you get an error related to the driver… you need to download the version made by Adafruit (see links at the end).

Extra

To start the two services:

sudo service isc-dhcp-server start
sudo service hostapd start

and if you want to start them at boot time :

sudo update-rc.d isc-dhcp-server enable
sudo update-rc.d hostapd enable

Useful links found in my research:

Broadcast Raspberry Pi camera

I need to broadcast the stream of my Raspberry Pi camera mounted on the front of the train. More info on the “train” project here (part 1) and here TODO.

PHOTO TODO

These are the results of my search on the possible solutions:

motion

more for security or motion detection

with VLC

The camera stream is sent to VLC which forwards it over the network:

raspivid -o - -t 0 -hf -w 640 -h 360 -fps 25 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554}' :demux=h264

-Slow, Delay

+Easy

Direct capture with the recent V4L2 driver

cvlc v4l2:///dev/video0 --v4l2-width 1920 --v4l2-height 1080 --v4l2-chroma h264 --sout '#standard{access=http,mux=ts,dst=0.0.0.0:12345}'

+Easy

-Require VLC

with ffmpeg/ffserver

https://www.ffmpeg.org/

A complete, cross-platform solution to record, convert and stream audio and video.

The stream is captured and streamed by ffmpeg:

raspivid -n -vf -hf -t 0 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vcodec copy -an -metadata title="Streaming from raspberry pi camera" -f flv $RTMP_URL/$STREAM_KEY

with MJPG streamer

MJPG-streamer takes JPGs from Linux-UVC compatible webcams, file system or other input plugins and streams them as M-JPEG via HTTP to web browsers

http://sourceforge.net/projects/mjpg-streamer/

with gstreamer

GStreamer is a library for constructing graphs of media-handling components. The applications it supports range from simple Ogg/Vorbis playback, audio/video streaming to complex audio (mixing) and video (non-linear editing) processing.

http://gstreamer.freedesktop.org/

RPi caminterface

RaspiMJPEG is an OpenMAX application based on the mmal library, which is comparable to RaspiVid. Both applications save the recording formatted as H264 into a file.

http://elinux.org/RPi-Cam-Web-Interface

WebRTC UV4L

WebRTC is a very powerful, modern standard protocol and gives a number of nice features.

http://www.linux-projects.org/modules/news/article.php?storyid=174

-Only works with the Pi 2

picamera

The project consists primarily of a class (PiCamera) which is a re-implementation of high-level bits of the raspistill and raspivid commands using the ctypes based libmmal header conversion, plus a set of encoder classes which re-implement the encoder callback configuration in the aforementioned binaries. Various classes for specialized applications also exist (PiCameraCircularIO, PiBayerArray, etc.)

https://picamera.readthedocs.org/en/release-1.10/install2.html
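Not tested either, but for reference here is a minimal sketch in the spirit of the picamera documentation recipes, serving the raw H.264 stream to a single TCP client; the port, resolution and duration are arbitrary choices of mine:

import socket
import picamera

# Wait for one TCP client on port 8000 and stream raw H.264 to it
server = socket.socket()
server.bind(('0.0.0.0', 8000))
server.listen(0)

conn = server.accept()[0].makefile('wb')
try:
    with picamera.PiCamera() as camera:
        camera.resolution = (640, 360)
        camera.framerate = 25
        camera.start_recording(conn, format='h264')
        camera.wait_recording(60)  # stream for 60 seconds
        camera.stop_recording()
finally:
    conn.close()
    server.close()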

avconv

avconv is a very fast video and audio converter that can also grab from a live audio/video source. It can also convert between arbitrary sample rates and resize video on the fly with a high quality polyphase filter.

CRTMPServer

http://www.rtmpd.com/

crtmpserver is a high performance streaming server able to stream (live or recorded).

live555

http://www.live555.com/liveMedia/

Not tested

PiStreaming

https://github.com/waveform80/pistreaming

Not tested


Finally I decided to use “RPi caminterface” which has the best feedback. I confirm it works out of the box (which saved me a lot of time).
I could maybe migrate to “WebRTC UV4L” if I decide to move to a Pi 2 in the future…


 

Here are the links I found during my search:

Wake on LAN using Raspberry Pi

I decided to add a feature to my home automation setup to be able to power on my desktop PC from my website (hosted on my Raspberry Pi).

Nothing simpler: you just have to use the Linux binary etherwake, available on Raspbian:

sudo apt-get install etherwake

then:

/usr/sbin/etherwake 20:cf:30:ca:8a:50 (with the MAC address of the desktop PC's network card)
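For reference, etherwake simply broadcasts a Wake-on-LAN “magic packet” (6 bytes of 0xFF followed by the target MAC repeated 16 times). A minimal Python sketch doing the same thing directly, using the same MAC as above, would look like this:

import socket

def send_magic_packet(iMac, iBroadcast="255.255.255.255", iPort=9):
    # Magic packet = 6 x 0xFF followed by the target MAC repeated 16 times
    aMacBytes = bytes.fromhex(iMac.replace(":", ""))
    aPacket = b"\xff" * 6 + aMacBytes * 16
    aSocket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    aSocket.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    aSocket.sendto(aPacket, (iBroadcast, iPort))
    aSocket.close()

send_magic_packet("20:cf:30:ca:8a:50")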

You also have to remember to configure the desktop PC's BIOS to support WOL. For the Web part I simply added a case in the gateway:

case "CMD_WOL" :
            $aCommandToExecute = 'sudo /usr/sbin/etherwake 20:cf:30:ca:8a:50';
            echo exec($aCommandToExecute);
            break;

and added a button:

and the callback:

$("#Bopen").click(function() 
        {
            $.ajax(
            {
                type: "POST",
                   url: "http://82.227.228.35/Sender/XbeeWrapper.php",
                   data: ({iCmdType : "CMD_WOL"}),
                   cache: false,
                   dataType: "text",
                   success: onSuccess
            });
        });
...
<input id="Bopen" type="button" name="open" value="Wake up PC"/>