Featured

TensorFlow-Powered Vision for a Pi-Based Robot

Introduction

This is a Pi-based robot that implements visual recognition (using Inception v3). The TensorFlow-powered vision system can recognize many objects such as people, cars, buses, fruit, and so on.

  • Hardware: Raspberry Pi 2, Sony PS3 Eye camera (a Logitech C270 USB camera also works with the Raspberry Pi)
  • Software: TensorFlow(v1.0.1), Jupyter-Notebook

Structure.png

My motivation

I was curious how well image recognition with TensorFlow would perform on a Raspberry Pi. The Jupyter notebook is also very convenient for instantly coding a quick prototype. In terms of image-classification error rate, Inception v3 (3.46%) even outperforms humans (5.1%), although the Raspberry Pi’s processing speed is very slow compared to my laptop.

(Table: Jeff Dean’s keynote @ Google Brain)

Chart_IR.png

  • Schematic diagram of Inception-v3

InceptionV3.png

Requirements and Installation

  • Install the webcam driver on your Raspberry Pi.
sudo apt-get install fswebcam
  • Test your webcam.
fswebcam test.jpg
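
If you prefer to test from Python, here is a minimal sketch (my addition, assuming OpenCV’s Python bindings are installed) that grabs one frame from the USB camera, much like the fswebcam test above:

import cv2

cap = cv2.VideoCapture(0)           # first USB camera (PS3 Eye or C270)
ok, frame = cap.read()              # grab a single frame
if ok:
    cv2.imwrite('test.jpg', frame)  # save it, as fswebcam does
    print('Saved test.jpg', frame.shape)
else:
    print('Could not read from the camera')
cap.release()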

Quick Start

  • You should install both TensorFlow (v1.0.1) and Jupyter Notebook on your Raspberry Pi.
  • First, clone the TensorFlow-Powered_Robot_Vision git repository here. This can be accomplished by:
cd /home/pi/Documents
git clone https://github.com/leehaesung/TensorFlow-Powered_Robot_Vision.git

Next, cd into the newly created directory:

cd TensorFlow-Powered_Robot_Vision

Start the Jupyter notebook on your Raspberry Pi:

jupyter-notebook

The pre-trained data (inception_v3.ckpt) will download automatically when you run the Jupyter notebook. (Location: /home/pi/Documents/datasets/inception)
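
For reference, here is a rough sketch (my addition, not the notebook’s exact code) of what that auto-download step does, assuming the public TF-Slim checkpoint URL:

import os, tarfile, urllib.request

URL = 'http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz'
DEST = '/home/pi/Documents/datasets/inception'

os.makedirs(DEST, exist_ok=True)
ckpt = os.path.join(DEST, 'inception_v3.ckpt')
if not os.path.exists(ckpt):
    archive, _ = urllib.request.urlretrieve(URL, os.path.join(DEST, 'inception.tar.gz'))
    with tarfile.open(archive, 'r:gz') as tar:
        tar.extractall(DEST)   # the archive contains inception_v3.ckpt
print('Checkpoint ready:', ckpt)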

Source Codes

Results of Object Recognition

  • Wow! The result is really awesome!!

RecognitionResult.png

Reference

Featured

PID Control For CPU Temperature of Raspberry Pi

[ PID Control For CPU Temperature of Raspberry Pi ]


Introduction

My motivation for PID control of the Raspberry Pi’s CPU temperature came from several problems: a very hot CPU, a very noisy fan, and fast battery consumption, since a hot CPU makes the system really unstable when the Raspberry Pi runs for a long time. So I addressed these problems using the PID node in Node-RED. It is also visually helpful for a trainee learning PID control systems, which makes it useful for educational purposes.

This will cover the basic steps you need to follow to get started with open-source components like the PID and MQTT nodes in Node-RED. Tuning the three gains KP, KI, and KD by hand (trial and error) is genuinely painful and hard. There are many tuning methods, such as manual tuning, Ziegler–Nichols, Tyreus–Luyben, Cohen–Coon, and Åström–Hägglund, as well as software tools such as Simulink in MATLAB or the Excel PID simulator (enclosed). I have already provided my source code in the download list, but if you use a different fan you should re-tune the PID gains, because most fans’ physical characteristics differ. You can get more information from the linked page (PID controller).

MQTT (Message Queuing Telemetry Transport) is a machine-to-machine (M2M) / Internet of Things (IoT) connectivity protocol designed to be extremely lightweight, and useful when low battery consumption and low network bandwidth are at a premium. It was invented in 1999 by Dr. Andy Stanford-Clark and Arlen Nipper and is now an OASIS standard. I have already published how to approach MQTT at the links below.

http://www.instructables.com/id/Smart-JPEG-Camera-for-Home-Security/

http://www.instructables.com/id/Smart-Gas-Valve-Checker-for-Home-Safety/

Step 1: Table of Contents

Step 0: Introduction

Step 1: Table of Contents

Step 2: Bill of Materials

Step 3: Setting up an acrylic clear case and a fan with the Raspberry Pi

Step 4: Programming NodeRED on Raspberry Pi2

Step 5: Setting up MQTT v3.1 on Raspberry Pi2

Step 6: Checking your NodeRED codes with MQTT on Raspberry Pi2

Step 7: Adding & Setting up PID node, Dashboard on Raspberry Pi2

Step 8: Using a dashboard for PID control

Step 9: Tuning PID controller

Step 10: Download list

Step 11: List of references

Step 2: Bill of Materials

Step 3: Setting up an acrylic clear case with a circuited fan with Raspberry Pi


Assembly steps

(1) I suggest using a breadboard to test everything before soldering and wiring.

(2) Connect the Raspberry Pi 2 with a PNP 1015 transistor, a fan, and a 102 (1 kΩ) variable resistor as shown in the circuit diagram above.

(3) I used a glue gun to attach the fan to the clear case.

(4) Lastly, connect a portable battery to the Raspberry Pi 2. (Any portable battery with the same size connector cable will work.)

Step 4: Programming NodeRED on Raspberry Pi2


How to start Node-RED in a web browser:

(1) Type the command shown below into a terminal window.

node-red-start

(2) Once started, Node-RED prints an address like ‘Once Node-RED has started, point a browser at http://169.254.170.40:1880’ (the IP address depends on your network).

(3) Open your web browser.

(4) Copy the IP address and paste it into the web browser.

(5) The Node-RED visual editor will appear in the web browser.

(6) You can start coding with the visual editor in the web browser.

(7) Try dragging and dropping any node from the left-hand palette into the workspace. It’s really easy to code. (You can conveniently use the visual editor offline as well as online.)

Next, download the ‘PID_Control_For_CPU_TEM_ver0.5.txt’ file and import it:

(1) Click the number (1) at the top right-hand corner of Node-RED in the web browser.

(2) Click the Import button in the drop-down menu.

(3) Open the Clipboard, shown in the first picture above.

(4) Lastly, paste the JSON text of ‘PID_Control_For_CPU_TEM_ver0.5.txt’ into the Import nodes editor.

Step 5: Setting up MQTT v3.1 on Raspberry Pi2


The Mosquitto message broker supports MQTT v3.1. It is easy to install on the Raspberry Pi and somewhat less easy to configure. Next we step through installing, configuring, and testing the Mosquitto broker from a terminal window.

curl -O http://repo.mosquitto.org/debian/mosquitto-repo.gpg.key
sudo apt-key add mosquitto-repo.gpg.key
rm mosquitto-repo.gpg.key
cd /etc/apt/sources.list.d
sudo curl -O http://repo.mosquitto.org/debian/mosquitto-jessie.list
sudo apt-get update

Next install the broker and command line clients:

  • mosquitto – the MQTT broker (or in other words, a server)
  • mosquitto-clients – command line clients, very useful in debugging
  • python-mosquitto – the Python language bindings
sudo apt-get install mosquitto mosquitto-clients python-mosquitto

As is the case with most packages from Debian, the broker is immediately started. Since we have to configure it first, stop it.

sudo /etc/init.d/mosquitto stop

Now that the MQTT broker is installed on the Pi we will add some basic security.

Create a config file:

cd /etc/mosquitto/conf.d/
sudo nano mosquitto.conf

Let’s stop anonymous clients connecting to our broker by adding a few lines to your config file. To control client access to the broker we also need to define valid client names and passwords. Add the lines:

allow_anonymous false
password_file /etc/mosquitto/conf.d/passwd
require_certificate false

Save and exit your editor (nano in this case).

From the current /conf.d directory, create an empty password file:

sudo touch passwd

We will use the mosquitto_passwd tool to create a password hash for user pi:

sudo mosquitto_passwd -c /etc/mosquitto/conf.d/passwd pi

You will be asked to enter your password twice. Enter the password you wish to use for the user you defined.

Testing Mosquitto on Raspberry Pi

Now that Mosquitto is installed we can perform a local test to see if it is working: Open three terminal windows. In one, make sure the Mosquitto broker is running:

mosquitto

In the next terminal, run the command line subscriber:

mosquitto_sub -v -t 'topic/test'

You should see the first terminal window echo that a new client is connected. In the next terminal, run the command line publisher:

mosquitto_pub -t 'topic/test' -m 'helloWorld'

You should see another message in the first terminal window saying another client is connected. You should also see this message in the subscriber terminal:

topic/test helloWorld

We have shown that Mosquitto is configured correctly and that we can both publish and subscribe to a topic. When you have finished testing, start the broker back up as a service:

sudo /etc/init.d/mosquitto start
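
The same publish/subscribe round trip can be driven from Python. Here is a minimal sketch (my addition; it assumes the paho-mqtt package, the successor to python-mosquitto, and the pi user created above with mosquitto_passwd):

import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    print('Connected with result code', rc)
    client.subscribe('topic/test')               # same topic as the terminal test
    client.publish('topic/test', 'helloWorld')   # publish once we are connected

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.username_pw_set('pi', 'your_password')    # credentials from mosquitto_passwd
client.on_connect = on_connect
client.on_message = on_message
client.connect('localhost', 1883)
client.loop_forever()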

Step 6: Checking your NodeRED codes with MQTT on Raspberry Pi2


When you import the JSON from ‘PID_Control_For_CPU_TEM_ver0.5.txt’ into Node-RED, each node is set up and configured automatically. I have already set up the data in each node.

(1) Click each node.

(2) Check that the information inside each node has been prefilled.

(3) Please don’t change the set data. (The above can be customized for more advanced users.)

Step 7: Adding & Setting up PID node, Dashboard on Raspberry Pi2

Searching the Nodes

Node-RED comes with a core set of useful nodes, but there are a growing number of additional nodes available for installing from both the Node-RED project as well as the wider community. You can search for available nodes in the Node-RED library or on the npm repository.

  • For example, we are going to search for ‘node-red-node-pidcontrol’ on the npm website. Click here.
  • Then we are going to install the npm packages node-red-node-pidcontrol and node-red-dashboard on the Raspberry Pi.

To add additional nodes you must first install the npm tool, as it is not included in the default installation. The following commands install npm and then upgrade it to the latest 2.x version.

sudo apt-get update
sudo apt-get install npm
sudo npm install -g npm@2.x
hash -r
cd /home/pi/.node-red
  • The general form is ‘npm install node-red-{node name}’.
  • Copy ‘npm install node-red-node-pidcontrol’ from the npm website and paste it into a terminal window.
  • Here we install both node-red-node-pidcontrol and node-red-dashboard:
npm install node-red-node-pidcontrol node-red-dashboard
  • You will need to restart Node-RED for it to pick up the new nodes.
node-red-stop
node-red-start
  • Close your web browser and reopen the web browser.

Step 8: Using a dashboard for PID control


The dashboard provides visual UI widgets such as gauges and charts. There is a basic tutorial on the Node-RED dashboard using node-red-dashboard: http://developers.sensetecnic.com/article/a-node-red-dashboard-using-node-red-contrib-ui/

How to use the dashboard on the Raspberry Pi:

(1) Click the number (1) of the gauge node.

(2) Set the group property.

(3) Set properties (3) through (7) as shown in the picture above.

(4) Click the number (8) to go to the dashboard window.

(5) Press the number (9) to display the gauge on the web browser.

(6) You can see the gauge which displays ‘40.1’ from the web browser.

(7) Setting up the chart on the web browser is the same as for the gauge (steps 1–9).

Step 9: Tuning PID controller

screenshot-2016-11-14-03-23-03
Screenshot 2016-11-14 23.34.39.png
screenshot-2016-11-14-03-20-28
screenshot-2016-11-18-02-00-28
screenshot-2016-11-14-03-11-13

There are many tuning methods, such as manual tuning, Ziegler–Nichols, Tyreus–Luyben, Cohen–Coon, and Åström–Hägglund, and software tools such as Simulink in MATLAB or the Excel PID simulator (enclosed). I used two tuning methods, manual tuning and the Ziegler–Nichols method, plus software tools such as MATLAB, Simulink, and Excel. (According to Wikipedia: PID controller)

  • Manual Tuning(Trial and error)

How do the PID parameters affect system dynamics?

We are most interested in four major characteristics of the closed-loop step response. They are

– Rise time: the time it takes for the plant output y to first rise to (roughly 90% of) the desired level.

– Overshoot: how much the peak level is higher than the steady state, normalized against the steady state.

– Settling time: the time it takes for the system to converge to its steady state.

– Steady-state Error: the difference between the steady-state output and the desired output.

(In the reference table, NT means no definite trend, i.e. only a minor change.)

How do we use the table?

Typical steps for designing a PID controller are: first, determine which characteristics of the system need to be improved, then:

– Use KP to decrease the rise time.

– Use KD to reduce the overshoot and settling time.

– Use KI to eliminate the steady-state error.

– This works in many cases, but what would be a good starting point? What if the first parameters we choose are totally crappy? Can we find a good set of initial parameters easily and quickly? (A minimal code sketch of how the three terms combine follows below.)
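
To make the roles of KP, KI, and KD concrete, here is a minimal discrete PID loop in Python (my own illustration, not the internals of the Node-RED PID node; the gains are placeholders to be tuned for your fan):

import time

KP, KI, KD = 2.0, 0.5, 0.1    # hypothetical gains; tune for your own fan
SETPOINT = 50.0               # target CPU temperature in degrees C
DT = 1.0                      # sample period in seconds

def read_cpu_temp():
    # Raspbian exposes the CPU temperature in millidegrees here.
    with open('/sys/class/thermal/thermal_zone0/temp') as f:
        return int(f.read()) / 1000.0

integral, prev_error = 0.0, 0.0
while True:
    error = read_cpu_temp() - SETPOINT        # positive when too hot
    integral += error * DT                    # KI acts on accumulated error
    derivative = (error - prev_error) / DT    # KD damps fast changes
    output = KP * error + KI * integral + KD * derivative
    duty = max(0.0, min(100.0, output))       # clamp to a valid fan duty cycle
    print('error %+.2f C -> fan duty %.1f%%' % (error, duty))
    prev_error = error
    time.sleep(DT)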

  • Ziegler–Nichols method

– Ziegler and Nichols conducted numerous experiments and proposed rules for determining values of KP, KI, and KD based on the transient step response of a plant.

– They proposed more than one method, but we will limit ourselves to what’s known as the first method of Ziegler–Nichols in this tutorial. It applies to plants with neither integrators nor dominant complex-conjugate poles, whose unit-step response resembles an S-shaped curve with no overshoot. This S-shaped curve is called the reaction curve.

– The S-shaped reaction curve can be characterized by two constants, delay time L and time constant T, which are determined by drawing a tangent line at the inflection point of the curve and finding the intersections of the tangent line with the time axis and the steady-state level line.

The Ziegler-Nichols Tuning Rule Table

Using the parameters L and T, we can set the values of KP, KI, and KD according to the formula shown in the table above.

These parameters will typically give you a response with about 25% overshoot and a good settling time. We can then fine-tune the controller using the basic rules that relate each parameter to the response characteristics.
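
As a reference for that table, the classic first-method formulas can be written down directly (a sketch using the textbook Ziegler–Nichols values for a full PID controller):

def zn_first_method(L, T):
    # Classic Ziegler-Nichols first-method gains from the reaction-curve
    # constants: delay time L and time constant T.
    Kp = 1.2 * T / L              # proportional gain
    Ti = 2.0 * L                  # integral time
    Td = 0.5 * L                  # derivative time
    return Kp, Kp / Ti, Kp * Td   # KP, KI, KD in parallel form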

  • PID tuning software

– Matlab: PID Controller Tuning

– Simulink: PID Controller Tuning

– Excel PID simulator

– Etc

  • PID control VS On/Off control

On/Off control: An on-off controller is the simplest form of temperature control device. The output from the device is either on or off, with no middle state. An on-off controller will switch the output only when the temperature crosses the setpoint. For heating control, the output is on when the temperature is below the setpoint, and off above the setpoint. Since the temperature must cross the setpoint to change the output state, the process temperature will cycle continually, going from below setpoint to above, and back below.

In cases where this cycling occurs rapidly, and to prevent damage to contactors and valves, an on-off differential, or “hysteresis,” is added to the controller operations. This differential requires that the temperature exceed the setpoint by a certain amount before the output will turn off or on again, and it prevents the output from “chattering,” or making fast, continual switches, if the cycling above and below the setpoint occurs very rapidly.

On-off control is usually used where precise control is not necessary, in systems which cannot handle having the energy turned on and off frequently, where the mass of the system is so great that temperatures change extremely slowly, or for a temperature alarm. One special type of on-off control used for alarms is a limit controller. This controller uses a latching relay, which must be manually reset, and is used to shut down a process when a certain temperature is reached.

PID control: This controller provides proportional with integral and derivative control, or PID. This controller combines proportional control with two additional adjustments, which helps the unit automatically compensate for changes in the system. These adjustments, integral and derivative, are expressed in time-based units; they are also referred to by their reciprocals, RESET, and RATE, respectively. The proportional, integral and derivative terms must be individually adjusted or “tuned” to a particular system using trial and error. It provides the most accurate and stable control of the three controller types, and is best used in systems which have a relatively small mass, those which react quickly to changes in the energy added to the process. It is recommended in systems where the load changes often and the controller is expected to compensate automatically due to frequent changes in setpoint, the amount of energy available, or the mass to be controlled.
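
For contrast with the PID loop sketched earlier, an on-off controller with hysteresis fits in a few lines (again my own illustration; the 2 °C band is arbitrary):

SETPOINT, HYSTERESIS = 50.0, 2.0
fan_on = False

def update(temp):
    global fan_on
    if temp > SETPOINT + HYSTERESIS:
        fan_on = True         # too hot: switch the fan on
    elif temp < SETPOINT - HYSTERESIS:
        fan_on = False        # cool enough: switch it off
    return fan_on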

Featured

$15 Intel Board for IoT-Developers

[ $15 Intel Board for IoT-developers ]

iotboard_intel

edc-mint-valley-diagram-16x9.png

Features

  • 8 KB Cache
  • Operating Voltage 2.0V – 3.3V
  • 32 MHz clock speed
  • Optimized power management for low battery consumption
  • 8 KB SRAM, 32 KB instruction flash, 8 KB OTP flash and 4 KB OTP data flash
  • Scalable Intel® Quark™ Microcontroller Software Interface
  • 2 channel DMA
  • Intel® System Studio for Microcontroller SDK

At $15, the Quark Microcontroller Developer Kit D2000 is perhaps the least expensive computer Intel has ever shipped.

The single-board computer has all the components mashed onto a tiny circuit board. It can be used to develop gadgets, wearables, home automation products, industrial equipment and other Internet of Things products.

Developers could also use the computer to hook up sensors for temperature, light, sound, weather and distance to devices.

The developer board is now available from Mouser Electronics. It will also be available from Avnet, according to Intel.

Intel is targeting companies developing IoT devices and the community of do-it-yourself hardware makers with the new board. These boards typically provide a cheap way to prototype electronics or to make fun devices. Intel is following Atmel, SparkFun, and other vendors that develop inexpensive boards.

This board can’t be compared to a high-powered single-board computer like the Raspberry Pi 3, which can double as a PC. The Intel board is smaller, consumes much less power, and has a much slower CPU.

Intel has shown examples of how such developer boards can be used. Its Curie board was used on snowboards at X Games to capture and provide real-time information on speed, the height of a jump, and other statistics to viewers and athletes.

Intel has been partnering with well-known products and TV shows to establish its brand recognition with makers, but the core community hasn’t warmed up to the chip maker’s products yet. Developer boards are mostly ARM-based, but the $15 board could provide Intel a breakthrough in the maker community.

The new developer board has the Quark D2000 microcontroller, which operates at a speed of 32MHz, the same frequency as the Quark chip on the button-sized Curie board.

The Intel board has a six-axis accelerometer, a magnetometer with a temperature sensor, and one USB 2.0 port. It also has a coin cell battery slot and a 5-volt power input.

The board is compatible with the hardware specifications of the Arduino Uno, a development board popular with makers. A development kit called Intel System Studio for Microcontrollers, which is based on the Eclipse integrated development environment, is also included.

 

Quick Start

  • Technical Documents: datasheets, user guides, design guides, schematics, and more
  • Sample and Buy: find everything from developer kits and samples to complete solutions from a local distributor
  • Getting Started on Windows*: step-by-step instructions to install both the software and hardware for Intel® Quark™ microcontrollers using Windows*
  • Getting Started on Linux*: step-by-step instructions to install both the software and hardware for Intel® Quark™ microcontrollers using Linux*

Intel® Quark™ Processor Software and Tools

Select the processor for which you need software and tools.

 

 

Featured

GPS Experiment With Arduino


[ GPS Experiment With Arduino ]

We call it the Ultimate GPS module because it’s got everything you want and more:

GPSModule.jpg

  • -165 dBm sensitivity, 10 Hz updates, 66 channels
  • 5V friendly design and only 20mA current draw
  • Breadboard friendly + two mounting holes
  • RTC battery-compatible
  • Built-in datalogging
  • PPS output on fix
  • Internal patch antenna + u.FL connector for external active antenna
  • Fix status LED

Arduino Wiring

 

Download the PDF of GPS Module with Arduino Wiring

Once you’ve tested the GPS module with direct wiring, we can go forward and wire it up to a microcontroller. We’ll be using an Arduino, but you can adapt our code to any other microcontroller that can receive TTL serial at 9600 baud.

Connect VIN to +5V, GND to Ground, RX to digital 2 and TX to digital 3.

gps_softserialwire

Next up, download the Adafruit GPS library. This library does a lot of the ‘heavy lifting’ required for receiving data from GPS modules, such as reading the streaming data in a background interrupt and auto-magically parsing it. To download it, visit the GitHub repository or just click below

Rename the uncompressed folder Adafruit_GPS. Check that the Adafruit_GPS folder contains Adafruit_GPS.cpp and Adafruit_GPS.h.

Move Adafruit_GPS to your Arduino/Libraries folder and restart the Arduino IDE. Library installation is a frequent stumbling block…if you need assistance, our All About Arduino Libraries guide spells it out in detail!

Leonardo & Micro Users: We have special example sketches in the Adafruit_GPS library that work with the Micro/Leo!

Open up the File→Examples→Adafruit_GPS→echo sketch and upload it to the Arduino. Then open up the serial monitor. This sketch simply reads data from the software serial port (pins 2&3) and outputs that to the hardware serial port connected to USB.

Open up the Arduino IDE Serial Console and make sure to set the serial baud rate to 115200.

You can configure the GPS output you see by commenting/uncommenting lines in the setup() procedure. For example, we can ask the GPS to send different sentences, and change how often it sends data. 10 Hz (10 times a second) is the max speed, and is a lot of data. You may not be able to output “all data” at that speed because the 9600 baud rate is not fast enough.

// You can adjust which sentences to have the module emit, below

// uncomment this line to turn on RMC (recommended minimum) and GGA (fix data) including altitude
GPS.sendCommand(PMTK_SET_NMEA_OUTPUT_RMCGGA);
// uncomment this line to turn on only the "minimum recommended" data for high update rates!
//GPS.sendCommand(PMTK_SET_NMEA_OUTPUT_RMCONLY);
// uncomment this line to turn on all the available data - for 9600 baud you'll want 1 Hz rate
//GPS.sendCommand(PMTK_SET_NMEA_OUTPUT_ALLDATA);

// Set the update rate
// 1 Hz update rate
//GPS.sendCommand(PMTK_SET_NMEA_UPDATE_1HZ);
// 5 Hz update rate - for 9600 baud you'll have to set the output to RMC or RMCGGA only (see above)
GPS.sendCommand(PMTK_SET_NMEA_UPDATE_5HZ);
// 10 Hz update rate - for 9600 baud you'll have to set the output to RMC only (see above)
//GPS.sendCommand(PMTK_SET_NMEA_UPDATE_10HZ);

In general, we find that most projects only need the RMC and GGA NMEA sentences, so you don’t need ALLDATA unless you have some need to know satellite locations.
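
To give a feel for what the library is parsing, here is a small Python sketch (my addition, not part of the Adafruit tutorial) that pulls the position out of a sample GGA sentence; on the Arduino, Adafruit_GPS does this for you:

sentence = '$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47'

fields = sentence.split(',')
lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0   # ddmm.mmmm -> degrees
if fields[3] == 'S':
    lat = -lat
lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0   # dddmm.mmmm -> degrees
if fields[5] == 'W':
    lon = -lon
print('UTC %s: lat %.5f, lon %.5f, satellites %s' % (fields[1], lat, lon, fields[7]))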

gpssch

 

YOLO-Powered_Robot_Vision


Introduction

This is a Pi-based robot that implements visual recognition (using YOLO). The YOLO-powered vision system can recognize many objects such as people, cars, buses, fruit, and so on.

  • Hardware: Raspberry Pi 2, Sony PS3 Eye camera

    (a Logitech C270 USB camera also works with the Raspberry Pi)

  • Software: YOLO(v2), Jupyter-Notebook

Structure.png

My motivation

I was interested in the performance of image recognition with YOLOv2 on the Raspberry Pi. In addition, the Jupyter notebook is really convenient for instantly coding a quick prototype. According to the paper, YOLO is a fast, accurate visual detector, making it ideal for computer vision systems; the authors connect YOLO to a webcam and verify that it maintains real-time performance. That said, the Raspberry Pi’s processing speed is very slow compared to my laptop.

(Picasso Dataset precision-recall curves: paper)

Perfomance_Picaso.png

(The Architecture: paper)

Architecture_CNN.png

Requirements and Installation

Quick Start

This post will guide you through detecting objects with the YOLO system using a pre-trained model. If you don’t already have Darknet installed, you should first install OpenCV 2 on your Raspberry Pi.

  • Install dependencies for OpenCV2
sudo apt-get update

sudo apt-get install build-essential
sudo apt-get install cmake git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev python-dev python-numpy libjpeg-dev libpng-dev libtiff-dev libjasper-dev

sudo apt-get install python-opencv
  • Check which version of OpenCV you have in Python
python
import cv2
cv2.__version__
  • Install the darknet for YOLO
git clone https://github.com/pjreddie/darknet
cd darknet
make

Easy!

You already have the config file for YOLO in the cfg/ subdirectory. You will have to download the pre-trained weight file here (258 MB). Or just run this:

wget http://pjreddie.com/media/files/yolo.weights

Then run the detector to test.

./darknet detect cfg/yolo.cfg yolo.weights data/dog.jpg

You will see some output like this:

layer     filters    size              input                output
    0 conv     32  3 x 3 / 1   416 x 416 x   3   ->   416 x 416 x  32
    1 max          2 x 2 / 2   416 x 416 x  32   ->   208 x 208 x  32
    .......
   29 conv    425  1 x 1 / 1    13 x  13 x1024   ->    13 x  13 x 425
   30 detection
Loading weights from yolo.weights...Done!
data/dog.jpg: Predicted in 0.016287 seconds.
car: 54%
bicycle: 51%
dog: 56%
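
If you want to drive the detector from the notebook rather than the shell, a small wrapper like this works (my own convenience sketch, not part of Darknet):

import subprocess

cmd = ['./darknet', 'detect', 'cfg/yolo.cfg', 'yolo.weights', 'data/dog.jpg']
out = subprocess.check_output(cmd, universal_newlines=True)
for line in out.splitlines():
    if '%' in line:        # prediction lines look like 'dog: 56%'
        print(line)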

Output

  • Rename ‘darknet’ to ‘YOLO-Powered_Robot_Vision’:
mv /home/pi/Documents/darknet /home/pi/Documents/YOLO-Powered_Robot_Vision
cd /home/pi/Documents/YOLO-Powered_Robot_Vision
  • Download ‘YOLO-Powered_Robot_Vision.ipynb’ at /home/pi/Documents/YOLO-Powered_Robot_Vision
wget https://github.com/leehaesung/YOLO-Powered_Robot_Vision/raw/master/YOLO-Powered_Robot_Vision.ipynb

jupyter-notebook

Source Codes

Result of Object Recognition

(Caution: I have used these image sources for educational purposes only. Please don’t use copyrighted pictures. I am not responsible for any images you use.)

predictions06.png

  • Another example:

predictions07.png

predictions08.png

predictions09.png

predictions11.png

Videos

  1. Deep Learning and Neural Networks with Kevin Duh: course page
  2. NY Course by Yann LeCun: 2014 version, 2015 version
  3. NIPS 2015 Deep Learning Tutorial by Yann LeCun and Yoshua Bengio (slides)(mp4,wmv)
  4. ICML 2013 Deep Learning Tutorial by Yann Lecun (slides)
  5. Geoffrey Hinton’s Coursera course on Neural Networks for Machine Learning
  6. Stanford 231n Class: Convolutional Neural Networks for Visual Recognition (videos, github, syllabus, subreddit, project final reports, twitter)
  7. Large Scale Visual Recognition Challenge 2014, arxiv paper
  8. GTC Deep Learning 2015
  9. Hugo Larochelle Neural Networks class, slides
  10. My youtube playlist
  11. Yaser Abu-Mostafa’s Learning from Data course (youtube playlist)
  12. Stanford CS224d: Deep Learning for Natural Language Processing: syllabus, youtube playlist, reddit, longer playlist
  13. Neural Networks for Machine Perception: vimeo
  14. Deep Learning for NLP (without magic): page, better page, video1, video2, youtube playlist
  15. Introduction to Deep Learning with Python: video, slides, code
  16. Machine Learning course with emphasis on Deep Learning by Nando de Freitas (youtube playlist), course page, torch practicals
  17. NIPS 2013 Deep Learning for Computer Vision Tutorial – Rob Fergus: video, slides
  18. Tensorflow Udacity mooc
  19. Oxford Deep NLP Course 2017 (github)

Links

  1. Deeplearning.net
  2. NVidia’s Deep Learning portal
  3. My flipboard page

AMIs, Docker images & Install Howtos

  1. Stanford 231n AWS AMI:  image is cs231n_caffe_torch7_keras_lasagne_v2, AMI ID: ami-125b2c72, Caffe, Torch7, Theano, Keras and Lasagne are pre-installed. Python bindings of caffe are available. It has CUDA 7.5 and CuDNN v3.
  2. AMI for AWS EC2 (g2.2xlarge): ubuntu14.04-mkl-cuda-dl (ami-03e67874) in Ireland Region: page. Installed software: Intel MKL, CUDA 7.0, cuDNN v2, theano, pylearn2, CXXNET, Caffe, cuda-convnet2, OverFeat, nnForge, Graphlab Create (GPU), etc.
  3. Chef cookbook for installing the Caffe deep learning framework
  4. Public EC2 AMI with Torch and Caffe deep learning toolkits (ami-027a4e6a): page
  5. Install Theano on AWS (ami-b141a2f5 with CUDA 7): page
  6. Running Caffe on AWS Instance via Docker: page, docs, image
  7. CVPR 2015 ITorch Tutorial (ami-b36981d8): page, github, cheatsheet
  8. Torch/iTorch/Ubuntu 14.04 Docker image: docker pull kaixhin/torch
  9. Torch/iTorch/CUDA 7/Ubuntu 14.04 Docker image: docker pull kaixhin/cuda-torch
  10. AMI containing Caffe, Python, Cuda 7, CuDNN, and all dependencies. Its id is ami-763a311e (disk min 8G,system is 4.6G), howto
  11. My Dockerfiles at GitHub

Examples and Tutorials

  1. IPython Caffe Classification
  2. IPython Detection, arxiv paper, rcnn github, selective search
  3. Machine Learning with Torch 7
  4. Deep Learning Tutorials with Theano/Python, CNN, github
  5. Torch tutorials, tutorial & demos from Clement Farabet
  6. Brewing Imagenet with Caffe
  7. Training an Object Classifier in Torch-7 on multiple GPUs over ImageNet
  8. Stanford Deep Learning Matlab based Tutorial (github, data)
  9. DIY Deep Learning for Vision: A Hands on tutorial with Caffe (google doc)
  10. Tutorial on Deep Learning for Vision CVPR 2014: page
  11. Pylearn2 tutorials: convolutional network, getthedata
  12. Pylearn2 quickstart, docs
  13. So you wanna try deep learning? post from SnippyHollow
  14. Object Detection ipython nb from SnippyHollow
  15. Filter Visualization ipython nb from SnippyHollow
  16. Specifics on CNN and DBN, and more
  17. CVPR 2015 Caffe Tutorial
  18. Deep Learning on Amazon EC2 GPU with Python and nolearn
  19. How to build and run your first deep learning network (video, behind paywall)
  20. Tensorflow examples
  21. Illia Polosukhin’s Getting Started with Tensorflow – Part 1, Part 2, Part 3
  22. CNTK Tutorial at NIPS 2015
  23. CNTK: FFN, CNN, LSTM, RNN
  24. CNTK Introduction and Book

People

  1. Geoffrey Hinton: Homepage, Reddit AMA (11/10/2014)
  2. Yann LeCun: Homepage, NYU Research Page, Reddit AMA (5/15/2014)
  3. Yoshua Bengio: Homepage, Reddit AMA (2/27/2014)
  4. Clement Farabet: Scene Parsing (paper), github, code page
  5. Andrej Karpathy: Homepage, twitter, github, blog
  6. Michael I Jordan: Homepage, Reddit AMA (9/10/2014)
  7. Andrew Ng: Homepage, Reddit AMA (4/15/2015)
  8. Jürgen Schmidhuber: Homepage, Reddit AMA (3/4/2015)
  9. Nando de Freitas: Homepage, YouTube, Reddit AMA (12/26/2015)

Datasets

  1. ImageNet
  2. MNIST (Wikipedia), database
  3. Kaggle datasets
  4. Kitti Vision Benchmark Suite
  5. Ford Campus Vision and Lidar Dataset
  6. PCL Lidar Datasets
  7. Pylearn2 list

Frameworks and Libraries

  1. Caffe: homepage, github, google group
  2. Torch: homepage, cheatsheet, github, google group
  3. Theano: homepage, google group
  4. Tensorflow: homepage, github, google group, skflow
  5. CNTK: homepage, github, wiki
  6. CuDNN: homepage
  7. PaddlePaddle: homepage, github, docs, quick start
  8. fbcunn: github
  9. pylearn2: github, docs
  10. cuda-convnet2: homepage, cuda-convnet, matlab
  11. nnForge: homepage
  12. Deep Learning software links
  13. Torch vs. Theano post
  14. Overfeat: page, github, paper, slides, google group
  15. Keras: github, docs, google group
  16. Deeplearning4j: page, github
  17. Lasagne: docs, github

Topics

  1. Scene Understanding (CVPR 2013, Lecun) (slides), Scene Parsing (paper)
  2. Overfeat: Integrated Recognition, Localization and Detection using Convolutional Networks (arxiv)
  3. Parsing Natural Scenes and Natural Language with Recursive Neural Networks: page, ICML 2011 paper

Reddit

  1. Machine Learning Reddit page
  2. Computer Vision Reddit page
  3. Reddit: Neural Networks: new, relevant
  4. Reddit: Deep Learning: new, relevant

Books

  1. Learning Deep Architectures for AI, Bengio (pdf)
  2. Neural Nets and Deep Learning (html, github)
  3. Deep Learning, Bengio, Goodfellow, Courville (html)
  4. Neural Nets and Learning Machines, Haykin, 2008 (amazon)

Papers

  1. ImageNet Classification with Deep Convolutional Neural Networks, Alex Krizhevsky, Ilya Sutskever, Geoffrey E Hinton, NIPS 2012 (paper)
  2. Why does unsupervised pre-training help deep learning? (paper)
  3. Hinton06 – Autoencoders (paper)
  4. Deep Learning using Linear Support Vector machines (paper)

Companies

  1. Kaggle: homepage
  2. Microsoft Deep Learning Technology Center

Conferences

  1. ICML
  2. PAMITC Sponsored Conferences
  3. NIPS: 2015

Installing & Testing Google TensorFlow on Raspberry Pi2

[ Installing & Testing Google TensorFlow on Raspberry Pi2 ]

Let’s install TensorFlow.

sudo apt-get update
# For Python 3.3+
sudo apt-get install python3-pip python3-dev

# For Python 3.3+
wget https://github.com/samjabrahams/tensorflow-on-raspberry-pi/releases/download/v1.0.1/tensorflow-1.0.1-cp34-cp34m-linux_armv7l.whl
sudo pip3 install tensorflow-1.0.1-cp34-cp34m-linux_armv7l.whl

# For Python 3.3+
sudo pip3 uninstall mock
sudo pip3 install mock

And then, let’s run the code below.

Ref.: https://www.tensorflow.org/get_started/

 

pi@raspberrypi:~ $ python3
Python 3.4.2 (default, Oct 19 2014, 13:31:11)
[GCC 4.9.1] on linux
Type "help", "copyright", "credits" or "license" for more information.
import tensorflow as tf
import numpy as np

# Create 100 phony x, y data points in NumPy, y = x * 0.1 + 0.3
x_data = np.random.rand(100).astype(np.float32)
y_data = x_data * 0.1 + 0.3
W = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
b = tf.Variable(tf.zeros([1]))
y = W * x_data + b
loss = tf.reduce_mean(tf.square(y - y_data))
optimizer = tf.train.GradientDescentOptimizer(0.5)
train = optimizer.minimize(loss)
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
for step in range(201):
     sess.run(train)
     if step % 20 == 0:
         print(step, sess.run(W), sess.run(b))


0 [ 0.2893075] [ 0.26960531]
20 [ 0.14367677] [ 0.27712572]
40 [ 0.11184501] [ 0.29379657]
60 [ 0.10321232] [ 0.29831767]
80 [ 0.10087116] [ 0.29954377]
100 [ 0.10023624] [ 0.29987627]
120 [ 0.10006408] [ 0.29996645]
140 [ 0.10001738] [ 0.29999092]
160 [ 0.1000047] [ 0.29999754]
180 [ 0.10000128] [ 0.29999936]
200 [ 0.10000037] [ 0.29999983]

screenshot-2017-01-28-22-17-32

Deep Learning Libraries

Software links

  1. Theano – CPU/GPU symbolic expression compiler in python (from MILA lab at University of Montreal)
  2. Torch – provides a Matlab-like environment for state-of-the-art machine learning algorithms in lua (from Ronan Collobert, Clement Farabet and Koray Kavukcuoglu)
  3. Pylearn2 – Pylearn2 is a library designed to make machine learning research easy.
  4. Blocks – A Theano framework for training neural networks
  5. Tensorflow – TensorFlow™ is an open source software library for numerical computation using data flow graphs.
  6. MXNet – MXNet is a deep learning framework designed for both efficiency and flexibility.
  7. Caffe – Caffe is a deep learning framework made with expression, speed, and modularity in mind.
  8. Lasagne – Lasagne is a lightweight library to build and train neural networks in Theano.
  9. Keras – a Theano-based deep learning library.
  10. Deep Learning Tutorials – examples of how to do Deep Learning with Theano (from LISA lab at University of Montreal)
  11. Chainer – A GPU based Neural Network Framework
  12. Matlab Deep Learning – Matlab Deep Learning Tools
  13. CNTK – Computational Network Toolkit – is a unified deep-learning toolkit by Microsoft Research.
  14. MatConvNet – A MATLAB toolbox implementing Convolutional Neural Networks (CNNs) for computer vision applications. It is simple, efficient, and can run and learn state-of-the-art CNNs.
  15. DeepLearnToolbox – A Matlab toolbox for Deep Learning (from Rasmus Berg Palm)
  16. Cuda-Convnet – A fast C++/CUDA implementation of convolutional (or more generally, feed-forward) neural networks. It can model arbitrary layer connectivity and network depth. Any directed acyclic graph of layers will do. Training is done using the back-propagation algorithm.
  17. Deep Belief Networks. Matlab code for learning Deep Belief Networks (from Ruslan Salakhutdinov).
  18. RNNLM – Tomas Mikolov’s Recurrent Neural Network based Language Models Toolkit.
  19. RNNLIB – RNNLIB is a recurrent neural network library for sequence learning problems. Applicable to most types of spatiotemporal data, it has proven particularly effective for speech and handwriting recognition.
  20. matrbm. Simplified version of Ruslan Salakhutdinov’s code, by Andrej Karpathy (Matlab).
  21. deeplearning4j – Deeplearning4J is an Apache 2.0-licensed, open-source, distributed neural net library written in Java and Scala.
  22. Estimating Partition Functions of RBM’s. Matlab code for estimating partition functions of Restricted Boltzmann Machines using Annealed Importance Sampling (from Ruslan Salakhutdinov).
  23. Learning Deep Boltzmann Machines Matlab code for training and fine-tuning Deep Boltzmann Machines (from Ruslan Salakhutdinov).
  24. The LUSH programming language and development environment, which is used @ NYU for deep convolutional networks
  25. Eblearn.lsh is a LUSH-based machine learning library for doing Energy-Based Learning. It includes code for “Predictive Sparse Decomposition” and other sparse auto-encoder methods for unsupervised learning. Koray Kavukcuoglu provides Eblearn code for several deep learning papers on this page.
  26. deepmat – Deepmat, Matlab-based deep learning algorithms.
  27. MShadow – MShadow is a lightweight CPU/GPU Matrix/Tensor Template Library in C++/CUDA. The goal of mshadow is to support efficient, device invariant and simple tensor library for machine learning project that aims for both simplicity and performance. Supports CPU/GPU/Multi-GPU and distributed system.
  28. CXXNET – CXXNET is fast, concise, distributed deep learning framework based on MShadow. It is a lightweight and easy extensible C++/CUDA neural network toolkit with friendly Python/Matlab interface for training and prediction.
  29. Nengo – Nengo is a graphical and scripting based software package for simulating large-scale neural systems.
  30. Eblearn is a C++ machine learning library with a BSD license for energy-based learning, convolutional networks, vision/recognition applications, etc. EBLearn is primarily maintained by Pierre Sermanet at NYU.
  31. cudamat is a GPU-based matrix library for Python. Example code for training Neural Networks and Restricted Boltzmann Machines is included.
  32. Gnumpy is a Python module that interfaces in a way almost identical to numpy, but does its computations on your computer’s GPU. It runs on top of cudamat.
  33. The CUV Library (github link) is a C++ framework with python bindings for easy use of Nvidia CUDA functions on matrices. It contains an RBM implementation, as well as annealed importance sampling code and code to calculate the partition function exactly (from AIS lab at University of Bonn).
  34. 3-way factored RBM and mcRBM is python code calling CUDAMat to train models of natural images (from Marc’Aurelio Ranzato).
  35. Matlab code for training conditional RBMs/DBNs and factored conditional RBMs (from Graham Taylor).
  36. mPoT is python code using CUDAMat and gnumpy to train models of natural images (from Marc’Aurelio Ranzato).
  37. neuralnetworks is a java based gpu library for deep learning algorithms.
  38. ConvNet is a matlab based convolutional neural network toolbox.
  39. Elektronn is a deep learning toolkit that makes powerful neural networks accessible to scientists outside the machine learning community.
  40. OpenNN is an open source class library written in C++ programming language which implements neural networks, a main area of deep learning research.
  41. NeuralDesigner  is an innovative deep learning tool for predictive analytics.
  42. Theano Generalized Hebbian Learning.
  43. Apache Singa is an open source deep learning library that provides a flexible architecture for scalable distributed training. It is extensible to run over a wide range of hardware, and has a focus on health-care applications.
  44. Lightnet  is a lightweight, versatile and purely Matlab-based deep learning framework. The aim of the design is to provide an easy-to-understand, easy-to-use and efficient computational platform for deep learning research.

If your software belongs here, email us and let us know.

 

NodeRED BlockChain

Build your own blockchain in 15 minutes on Node-RED using Node.js, JavaScript, Cloudant/CouchDB on a free IBM Cloud account… Note: To do the tutorial you need a free Bluemix (IBM PaaS Cloud) account. You can obtain one here and the raw file (JSON) for this NodeRED flow is here. Tutorial Objective In this exercise […]

via Node-RED Blockchain — romeokienzler

Installing Cylon.js for the Raspberry Pi

[ Installing Cylon.js for the Raspberry Pi ]

 

Repository | Issues

The Raspberry Pi is an inexpensive and popular ARM based single board computer with digital & PWM GPIO, and i2c interfaces built in.

The Raspberry Pi is a credit-card-sized single-board computer developed in the UK by the Raspberry Pi Foundation with the intention of promoting the teaching of basic computer science in schools

For more info about the Raspberry Pi platform, click here.

How to Install

Installing Cylon.js for the Raspberry Pi is easy, but must be done on the Raspi itself, or on another Linux computer. Due to I2C device support, the module cannot be installed on OS X or Windows.

Install the module with:

$ npm install cylon cylon-raspi

How to Use

This small program causes an LED to blink.

var Cylon = require("cylon");

Cylon.robot({
  connections: {
    raspi: { adaptor: 'raspi' }      // use the Raspberry Pi adaptor
  },

  devices: {
    led: { driver: 'led', pin: 11 }  // LED on physical pin 11 (GPIO 17)
  },

  work: function(my) {
    every((1).second(), my.led.toggle);  // toggle the LED once per second
  }
}).start();
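
Save the program as, say, blink.js (the filename is my choice) and run it on the Pi with:

node blink.js

Cylon starts the robot’s work loop when .start() is called, so the LED on physical pin 11 should begin blinking once per second.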

How to Connect

Install the latest Raspbian OS

You can get it from here: http://www.raspberrypi.org/downloads/

Setting the Raspberry Pi keyboard

Having trouble with your Raspberry Pi keyboard layout? Use the following command:

sudo dpkg-reconfigure keyboard-configuration

Update your Raspbian and install Node.js

These commands need to be run after SSHing into the Raspi:

sudo apt-get update
sudo apt-get upgrade
wget http://nodejs.org/dist/v0.10.28/node-v0.10.28-linux-arm-pi.tar.gz
tar -xvzf node-v0.10.28-linux-arm-pi.tar.gz
node-v0.10.28-linux-arm-pi/bin/node --version

You should see the node version you just installed.

$ node --version
v0.10.28

Once you have installed Node.js, you need to add the following to your ~/.bash_profile file. Create this file if it does not already exist, and add this to it:

NODE_JS_HOME=/home/pi/node-v0.10.28-linux-arm-pi
PATH=$PATH:$NODE_JS_HOME/bin

This will set up the path for you every time you log in. Run the source ~/.bash_profile command to load it right now without having to log in again.

Thanks @joshmarinacci for the blog post at http://joshondesign.com/2013/10/23/noderpi where these modified instructions were taken.

Connecting to Raspberry Pi GPIO

This module only works on a real Raspberry Pi. Do not bother trying on any other kind of computer; it will not work. Also note that you will need to connect actual circuits to the Raspberry Pi’s GPIO pins.

In order to access the GPIO pins without using sudo, you will need to add the pi user to the gpio group:

sudo usermod -G gpio pi

And also add the following udev rules file to /etc/udev/rules.d/91-gpio.rules:

SUBSYSTEM=="gpio", KERNEL=="gpiochip*", ACTION=="add", PROGRAM="/bin/sh -c 'chown root:gpio /sys/class/gpio/export /sys/class/gpio/unexport ; chmod 220 /sys/class/gpio/export /sys/class/gpio/unexport'"
SUBSYSTEM=="gpio", KERNEL=="gpio*", ACTION=="add", PROGRAM="/bin/sh -c 'chown root:gpio /sys%p/active_low /sys%p/direction /sys%p/edge /sys%p/value ; chmod 660 /sys%p/active_low /sys%p/direction /sys%p/edge /sys%p/value'"

Thanks to “MikeDK” for the above solution: https://www.raspberrypi.org/forums/viewtopic.php?p=198148#p198148

Enabling the Raspberry Pi i2c on Raspbian

You must add these two entries to your /etc/modules:

i2c-bcm2708
i2c-dev

You must also ensure that these entries are commented out in your /etc/modprobe.d/raspi-blacklist.conf:

#blacklist spi-bcm2708
#blacklist i2c-bcm2708

You will also need to update the /boot/config.txt file. Edit it and add the following text:

dtparam=i2c1=on
dtparam=i2c_arm=on

Finally, you need to allow the pi user permissions to access the i2c interface by running this command:

sudo usermod -G i2c pi

Now restart your Raspberry Pi.
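
After the reboot, a quick way to confirm the bus is reachable from Python (my addition; it assumes the python-smbus package is installed):

import smbus

bus = smbus.SMBus(1)   # i2c bus 1 on recent Raspberry Pi models (bus 0 on rev 1 boards)
print('i2c bus opened OK')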

Enabling PWM output on GPIO pins.

You need to install and have pi-blaster running in the raspberry-pi, you can follow the instructions for pi-blaster install in the pi-blaster repo here:

https://github.com/sarfata/pi-blaster

Available PINS

The following object depicts the available pins for all revisions of the Raspberry Pi. The key is the number of the physical pin header on the board; the value is the GPIO pin number assigned by the OS. For pins that changed between board revisions, the value contains the GPIO pin number for each revision (e.g. rev1, rev2, rev3).

You should only be concerned with the key (the number of the physical pin header on the board); Cylon.js takes care of the board revision and GPIO pin numbers for you. The full list below is for reference only.

PINS = {
  3: {
    rev1: 0,
    rev2: 2,
    rev3: 2
  },
  5: {
    rev1: 1,
    rev2: 3,
    rev3: 3
  },
  7: 4,
  8: 14,
  10: 15,
  11: 17,
  12: 18,
  13: {
    rev1: 21,
    rev2: 27,
    rev3: 27
  },
  15: 22,
  16: 23,
  18: 24,
  19: 10,
  21: 9,
  22: 25,
  23: 11,
  24: 8,
  29: {
    rev3: 5
  },
  31: {
    rev3: 6
  },
  32: {
    rev3: 12
  },
  33: {
    rev3: 13
  },
  35: {
    rev3: 19
  },
  36: {
    rev3: 16
  },
  37: {
    rev3: 26
  },
  38: {
    rev3: 20
  },
  40: {
    rev3: 21
  }
};

The website http://pi.gadgetoid.com/pinout has a great visual representation of this information.

Drivers

All Cylon.js digital and PWM GPIO drivers listed below should work with the Raspberry Pi:

I2C Drivers