Mbed OS on Arduino Nano 33 BLE

Arduino released several new boards, including the Nano 33 BLE, which piqued my interest because it is based on the nRF52840 chip ( https://www.nordicsemi.com/Products/Low-power-short-range-wireless/nRF52840 ), which should be compatible with Bluetooth Mesh. I decided to give it a try…

This is the first Arduino board based on the nRF52840, so there was no existing Arduino core for it, which meant some challenges to get Arduino running on it. They went for a smart solution: running the Arduino core on top of Mbed OS, so they could reuse all the work already done by Mbed OS to support the nRF52840 chip. This is explained on their blog if you are curious: https://blog.arduino.cc/2019/07/31/why-we-chose-to-build-the-arduino-nano-33-ble-core-on-mbed-os/

That’s how I discovered Mbed OS: “Arm Mbed OS is a free, open-source embedded operating system designed specifically for the “things” in the Internet of Things.” https://www.mbed.com/en/platform/mbed-os/ It seems promising, especially since it has a very complete Bluetooth stack, Cordio (https://os.mbed.com/docs/mbed-cordio/19.02/introduction/index.html), which includes Bluetooth Mesh capability (https://community.arm.com/iot/b/blog/posts/bluetooth-mesh-with-bluetooth-5-and-cordio-radio-ip), something the Arduino Bluetooth library does not offer today (https://github.com/arduino-libraries/ArduinoBLE/issues/22). That’s why I decided to try programming with Mbed OS directly instead of Arduino. Nevertheless, my first idea was to keep the Arduino bootloader on the board so that I could program it WITHOUT any extra device and benefit from the Arduino bootloader.

Code

I used Mbed Studio to avoid dealing with the setup of a toolchain on Windows: https://os.mbed.com/studio/

The only “difficulty” is to use an mbed-os version recent enough (>= 5.14) to have the Arduino Nano 33 in the list of targets.
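If your project pins mbed-os through the usual mbed-os.lib file, pointing it at a recent enough release is one way to get there; assuming the standard .lib convention (a single line with the repo URL and a revision), it would look like this for example:

https://github.com/ARMmbed/mbed-os/#mbed-os-5.14.0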

I’ve done a very simple program that blinks an LED on pin p47 three times:

#include "mbed.h"
#include "ble/BLE.h"

DigitalOut led1(p47);

// main() runs in its own thread in the OS
int main()
{
    // Pattern: blink 3 times, then wait
    while (true) {
        // Turn led on
        led1 = 1;
        wait_ms(200);
        led1 = 0;
        wait_ms(200);
        led1 = 1;
        wait_ms(200);
        led1 = 0;
        wait_ms(200);
        led1 = 1;
        wait_ms(200);
        led1 = 0;
        wait_ms(3000);
    }
} 

Also, in your project folder you should have a /mbed-os/targets/targets.json file with the NANO33 definition.

I added 2 parameters. The first is "OUTPUT_EXT": "bin", to generate a bin file instead of a hex file (the uploader uses bin files).

The second is "mbed_app_start": "0x10000", so that the toolchain knows my program will be located in memory at address 0x10000. This value is part of the Arduino bootloader code and was explained to me here: https://github.com/arduino/ArduinoCore-nRF528x-mbedos/issues/19 The Arduino bootloader jumps to the code at this address after it boots.
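For reference, the two additions end up in the ARDUINO_NANO33BLE entry of targets.json, roughly like this (the "..." stands for the stock fields of the target definition, which stay untouched):

"ARDUINO_NANO33BLE": {
    ...,
    "OUTPUT_EXT": "bin",
    "mbed_app_start": "0x10000"
}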

After you compile the code with Mbed Studio you will find the resulting binary in “C:\Users\charl\Mbed Programs\mbed-os-example-blinky\BUILD\ARDUINO_NANO33BLE\ARMC6”, ready to be uploaded to the board.

Upload

This part is tricky since my original goal was to use the Arduino bootloader to upload my code to the board without any other tool. It took me some time to understand how to do it, but luckily I got some help here: https://github.com/arduino/ArduinoCore-nRF528x-mbedos/issues/19 and so I decided to use BOSSAC, which is the same tool used by the Arduino IDE. You can see it in the logs if you activate the extra logs in the IDE.

I reused the bossac binary shipped with the Arduino IDE and added an option to upload at address 0x10000. You need to press the reset button on the board twice to force the bootloader to keep running before starting the upload command.

C:\Users\charl\AppData\Local\Arduino15\packages\arduino\tools\bossac\1.9.1-arduino1> .\bossac.exe -d --port=COM5 -o 0x10000 -U -i -e -w 'C:\Users\charl\Mbed Programs\mbed-os-example-blinky\BUILD\ARDUINO_NANO33BLE\ARMC6\mbed-os-example-blinky.bin' -R
 Set binary mode
 version()=Arduino Bootloader (SAM-BA extended) 2.0 [Arduino:IKXYZ]
 Connected at 921600 baud
 identifyChip()=nRF52840-QIAA
 write(addr=0,size=0x34)
 writeWord(addr=0x30,value=0x400)
 writeWord(addr=0x20,value=0)
 version()=Arduino Bootloader (SAM-BA extended) 2.0 [Arduino:IKXYZ]
 Device       : nRF52840-QIAA
 Version      : Arduino Bootloader (SAM-BA extended) 2.0 [Arduino:IKXYZ]
 Address      : 0x0
 Pages        : 256
 Page Size    : 4096 bytes
 Total Size   : 1024KB
 Planes       : 1
 Lock Regions : 0
 Locked       : none
 Security     : false
 Erase flash
 chipErase(addr=0x10000)
 Done in 0.003 seconds
 Write 43884 bytes to flash (11 pages)
 [                              ] 0% (0/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x10000, size=0x1000)
 [==                            ] 9% (1/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x11000, size=0x1000)
 [=====                         ] 18% (2/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x12000, size=0x1000)
 [========                      ] 27% (3/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x13000, size=0x1000)
 [==========                    ] 36% (4/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x14000, size=0x1000)
 [=============                 ] 45% (5/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x15000, size=0x1000)
 [================              ] 54% (6/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x16000, size=0x1000)
 [===================           ] 63% (7/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x17000, size=0x1000)
 [=====================         ] 72% (8/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x18000, size=0x1000)
 [========================      ] 81% (9/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x19000, size=0x1000)
 [===========================   ] 90% (10/11 pages) write(addr=0x34,size=0x1000)
 writeBuffer(scr_addr=0x34, dst_addr=0x1a000, size=0x1000)
 [==============================] 100% (11/11 pages)
 Done in 2.076 seconds
 reset()
 PS C:\Users\charl\AppData\Local\Arduino15\packages\arduino\tools\bossac\1.9.1-arduino1>

Seems to be working… nevertheless the LED does not blink 🙁

In the meantime I bought a programmer for the chip, since Segger has an “education/hobbyist” version for only 20$… https://www.segger.com/products/debug-probes/j-link/models/j-link-edu-mini/ I was not aware that I could get an official programmer for such a low price… which made my original idea of reusing the Arduino bootloader to avoid buying one useless…

Now that I have my hands on a programmer, I decided to try to understand why my program was not running… is it my code, a failed upload, or something else? I modified my code to remove the offset and start at address 0x0, uploaded it directly on the chip, and the LED started blinking. The code is thus good and the issue seems located in the upload process… I did another test: I put back the 0x10000 offset in my program, erased the board, uploaded the Arduino bootloader, and finally uploaded my program at address 0x10000, but this time with the J-Link tools instead of BOSSAC. The LED started blinking!
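If you want to reproduce that upload from the command line instead of the Segger GUI tools, J-Link Commander can do it; a sequence along these lines should work (device name and file path taken from this post, adjust to your setup):

JLink.exe -device nRF52840_xxAA -if SWD -speed 4000 -autoconnect 1
J-Link>loadfile "C:\Users\charl\Mbed Programs\mbed-os-example-blinky\BUILD\ARDUINO_NANO33BLE\ARMC6\mbed-os-example-blinky.bin" 0x10000
J-Link>r
J-Link>g
J-Link>q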

I compared the memory at address 0x10000 in both cases. First, the test where I upload with BOSSAC at address 0x10000 and my program does not work:

and then when I upload with J-Link:

It seems the upload with BOSSAC does not work, which confirms what I thought.

Conclusion

I stop here since I now have a programmer to program the chip directly, so I can move on with my Bluetooth Mesh tests.

Thanks to https://github.com/facchinm for helping me understand more about the Arduino bootloader, and thanks to https://www.segger.com/ for selling a powerful programmer to hobbyists at a very reasonable price.

If you want to try Mbed OS you should pick a compatible board offering DAPLink for USB upload: “Arm Mbed DAPLink is an open-source software project that enables programming and debugging application software running on Arm Cortex CPUs. Commonly referred to as interface firmware, DAPLink runs on a secondary MCU that is attached to the SWD or JTAG port of the application MCU. This configuration is found on nearly all development boards. It creates a bridge between your development computer and the CPU debug access port. DAPLink enables developers with drag-and-drop programming, a serial port and CMSIS-DAP based debugging.” I would suggest this one for the nRF52840 chip >> https://os.mbed.com/platforms/Nordic-nRF52840-DK/

Burn bootloader Arduino nano 33 BLE

A quick post to explain how to restore the original Arduino Nano 33 BLE bootloader in case you erased it by mistake (another story for another post)…

Hardware

I use a J-Link EDU Mini as an external programmer for the chip of the board (nRF52840 – https://www.nordicsemi.com/Products/Low-power-short-range-wireless/nRF52840). It’s a very cheap programmer (15$) that can only be used for non-commercial applications but offers the same functionality as a 400$ programmer. Thanks Segger for offering that to hobbyists! More info about it here: https://www.segger.com/products/debug-probes/j-link/models/j-link-edu-mini/
I would also suggest buying an adapter for it, “designed to make it easier to use ARM dev boards that use slimmer 2×5 (0.05″/1.27mm pitch) SWD cables for programming”, like this one: https://www.adafruit.com/product/2743

Connection

There are 5 wires to connect between the Arduino Nano 33 BLE and the J-Link programmer, but first let’s locate the SWD probe pads on the Nano 33 BLE (I used a Nano 33 BLE Sense board in the picture, but it is the same as the Nano 33 BLE).

The pins are:
1 – Vref (3.3V in our case)
2 – SWDIO (data)
3 – SWCLK (clock)
4 – Ground
5 – Reset

The tricky part is soldering the wires, because all the pads are small and close to each other (I wish they were exposed in a more practical way, but we are lucky to have access to them at all ;) )

The pinout of the J-Link is explained in the official doc https://www.segger.com/products/debug-probes/j-link/models/j-link-edu-mini/ and it’s even easier if you use the adapter, since the names are written on it.

The mapping is thus

Nano 33 BLE   J-Link EDU
Vref          Vref
SWDIO         SWIO
SWCLK         CLK
Ground        GND (any)
Reset         RST

Software

You need the full J-Link package (which is free), which you can download from the Segger website here: https://www.segger.com/downloads/jlink/ We are going to use J-Flash Lite specifically, but you might as well install the whole package 😉

Turn on the 2 boards (don’t forget to plug in the Nano 33, or Vref will be 0V and the J-Link will fail to connect to the Arduino board).

We need to locate the Arduino bootloader file of the Nano 33 BLE that we are going to upload with J-Flash Lite… It is located in C:\Users\charl\AppData\Local\Arduino15\packages\arduino\hardware\mbed\1.1.2\bootloaders\nano33ble (adapt the path to your setup); the folder contains 3 files.

We are going to use the bin one (it should work with the hex one too) and give this path to J-Flash Lite. Be sure to choose nRF52840 in the list of devices (the rest of the parameters can stay at their default values) and then click “Program Device”.

You should see some logs similar to these, and then the board has its original Arduino bootloader back and is usable again from the Arduino IDE 😉

DEFCON scale

Let’s start with an explanation of what the DEFCON scale is, from https://en.wikipedia.org/wiki/DEFCON:

The DEFCON system was developed by the Joint Chiefs of Staff (JCS) and unified and specified combatant commands.[2] It prescribes five graduated levels of readiness (or states of alert) for the U.S. military. It increases in severity from DEFCON 5 (least severe) to DEFCON 1 (most severe) to match varying military situations.

https://en.wikipedia.org/wiki/DEFCON

and a picture of the final result to better understand what it looks like:

Hardware

The scale is made of wood, with 9 LEDs behind each level/number.

The logic of the scale comes from an Arduino Micro with a SparkFun BLE module (“SparkFun Bluetooth Mate Silver” – https://www.sparkfun.com/products/12576), which is not used for now. The Arduino is powered by 9V regulated from the 12V main supply (used for the LEDs). Each number’s backlight is controlled by a MOSFET (P30N06LE) driven by the Arduino. There are also 2 buttons for tests, to increase/decrease the level.

Software

Arduino

The Arduino Micro communicates with a computer over USB to receive the level it should set on the scale. It does that by driving 5 MOSFETs to light the proper panel. It also listens for presses on the 2 buttons to raise/decrease the level (for test purposes). The code is quite simple.

PC

The scale communicates with a computer to receive the level it should display. The level is computed from my company’s issue tracking tool. The computation code interfaces with some of my company’s APIs and is thus not part of the published code… You will have to implement your own logic in the Python function “getSeverity”, which should return an integer between 5 (low level) and 1 (critical level), as per the DEFCON standard 😉
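To give an idea of the shape of that Python part, here is a minimal sketch (the serial port name, baud rate and one-character protocol are assumptions for illustration, not the actual script):

import serial  # pyserial


def getSeverity():
    # Placeholder: query your issue tracker here and map the result to a
    # DEFCON level between 5 (low) and 1 (critical).
    return 5


def updateScale(port="COM5"):  # hypothetical port name
    level = getSeverity()
    with serial.Serial(port, 9600, timeout=2) as arduino:
        # The Arduino sketch reads the level from USB serial and drives the MOSFETs.
        arduino.write(str(level).encode("ascii"))


if __name__ == "__main__":
    updateScale()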

The python part should be put in a crontab to regularly update the scale 😉

More pictures of the project HERE and the code is HERE.

Quick table with leftovers 😉

I had an old-generation Arduino (Uno) and an LED strip (white, non-addressable) that I no longer needed, so I decided to use them to make a nice table.

One push button to change the PWM and thus the lighting of the table, and an LED strip connected through a MOSFET driven by the Arduino.

Here is the result 

More pic here:  https://photos.app.goo.gl/ny27bXQZDyYSCEhB9

Very simple quick code here: https://bitbucket.org/charly37/ledtable/src/master/

Just wanted to do a quick post before working on the post for the next project, which is more fun.

Yammer REST API

The goal is still to create a process which gets a random cat picture and posts it in a Yammer group daily. Last week I tried with the Yammer connector available in MS Flow, but it does not support posting images, as I explained in https://djynet.net/?p=945 . This time I decided to go one level deeper and use the Yammer REST API, which supports it.

Yammer REST API 

We want to use the /messages POST REST call described in the official REST API doc here: https://developer.yammer.com/docs/messages-json-post which mentions the support of attachments: “Yammer provides two methods to associate attachments with a message. Both make use of multi-part HTTP upload (see RFC1867)”.

To be able to post we need to authenticate ourselves with the Yammer OAuth2 flow described here: https://developer.yammer.com/docs/oauth-2. I don’t want to detail it too much since it’s pretty standard, but basically our server offers a /login route which redirects to yammer.com.

 app.get('/login', (req, res) => {
    var aLoginUri = "https://www.yammer.com/oauth2/authorize?client_id=CN737QnN3TCu2ooY7U2rbA&response_type=code&redirect_uri=https://djynet.xyz/callback";
    res.send(aLoginUri);
    console.log('Sent login URI response');
}); 

Then yammer.com redirects the user back to our server on the /callback route with a code, which our server exchanges for an access token that we can then use when querying the yammer.com API to post messages.

 // OAuth2 endpoint (callback)
app.get('/callback', (req, res) => {
    res.end()
    console.log('Received Oauth login callback with code ' + req.url);

    //Calling Oauth to authenticate the APP
    var aUriAuthent = "https://www.yammer.com/oauth2/access_token?client_id=CN737QnN3TCu2ooY7U2rbA&client_secret=" + aClientSecret + "&code=" + req.query.code + "&grant_type=authorization_code";
    axios.post(aUriAuthent)
        .then((res) => {
            //console.log("Dumping response for debuging: " +res)
            //console.log("Dumping data from response for Debug: ", res.data)
            aAUthTest2 = res.data;
        })
        .catch((error) => {
            console.error(error)
        })
}); 

Getting a random cat picture

Of course there is an API for that 😉 https://thecatapi.com/ The API is free but you need to register to get an API key, which you specify in the ‘x-api-key’ header when calling it. The endpoint we need is https://api.thecatapi.com/v1/images/search, which we call without any parameters and which gives us a random cat picture URL.

 async function getCatUrl() {
    console.log('Entering getCatUrl');
    var aCatUrl = "https://upload.wikimedia.org/wikipedia/commons/4/4d/Cat_November_2010-1a.jpg";
    const aTemp = await axios.get("https://api.thecatapi.com/v1/images/search", { params: {}, headers: { 'x-api-key': aCatApiKey } });
    aCatUrl = aTemp.data[0].url
    console.log('Exiting getCatUrl with: ', aCatUrl);
    return aCatUrl;
} 

One note here is the use of async/await so that we “wait” for the catApi response before we proceed and post our picture in the Yammer room. I will not detail async/await… there are so many good docs already (google it).

Posting the image 

We now have a token and a cat picture URL that we can use to post our message. This is the only complicated part of this whole project, due to “Both make use of multi-part HTTP upload (see RFC1867)”. I found this NPM module which makes the process easier: https://www.npmjs.com/package/form-data We can use it to create the “multipart/form-data” body and then hand it to another module to send it to the Yammer API. Here is the form part:

var formData = new FormData(); 
formData.append('attachment1', Request(aCatUrl));  

This is quite straightforward, as explained in their readme. Then we pass the form to another node module to send it.

Axios 

The first module I tried to use for the POST call is Axios: https://www.npmjs.com/package/axios which we already use to get the random cat picture. Nevertheless, the form-data documentation for using it with Axios has a bug which I was unable to work around, so I opened a bug report and switched to another library. The bug has since been fixed by a documentation change: https://github.com/form-data/form-data/issues/439

Https 

Instead of Axios we can use the native HTTPS Node.js module described here: https://nodejs.org/api/https.html and pass it our form:

// Patch header to add the key
var aHeader = formData.getHeaders();
aHeader['Authorization'] = "Bearer " + aAUthTest2.access_token.token;

var request = https.request({
    method: 'post',
    host: 'www.yammer.com',
    //very dirty.... did not find a way to pass param otherwise :(
    path: '/api/v1/messages.json?body=Cat%20of%20the%20day%20&group_id=7799980032',
    headers: aHeader
});

//send it
formData.pipe(request); 

Final touch 

I added a secret key to the /postcat route to ensure nobody else can use it to spam the room with cats:

if (req.query.key !== aPostCatSecretKey) {
    console.log("Invalid key: ", req.query.key, " - send back 401");
    res.sendStatus(401);
}

And then I added a crontab entry to call our API every day:

0 1 * * * curl https://djynet.xyz/postcat?key=mysecretkey 

All code is here: https://bitbucket.org/charly37/catyammer/src/master/ 

And the result:  

Yammer connector

The goal is to create an MS Office 365 Flow which will call a homemade connector to get a random cat picture and post it in a Yammer group.

Yammer connector 

The first thing to do is to check whether there is an MS Office 365 connector that can get us a random cat picture (who knows… maybe someone already made one). After checking https://flow.microsoft.com/en-us/connectors/ it seems that is not the case. So we have to create one.

I will not go into too much detail since the MS doc is quite good: https://docs.microsoft.com/en-us/connectors/custom-connectors/define-openapi-definition. One interesting point is that the endpoint needs to be HTTPS, so I played with Let’s Encrypt and their docker version of certbot… Nothing fancy apart from that… a classic Node.js server with only one endpoint, hardcoded to start with:

app.get('/v2/cat', (req, res) => {
    var aCatUrl = "https://i.imgur.com/x7X0Fxf.jpg";
    res.send(aCatUrl);
    console.log('Sent response');
});

The full code is here https://bitbucket.org/charly37/catpy/src/master/ 

Once the connector is ready you can validate it with a quick Postman call like:

Then you can add it as an MS 365 custom connector as explained in the MS doc. I used the “from OpenAPI file” method with the following file: https://bitbucket.org/charly37/catpy/src/master/myapp/swagger.json
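For the curious, a minimal Swagger 2.0 definition for such a single GET endpoint looks roughly like this (illustrative only, the real file is the one linked above):

{
  "swagger": "2.0",
  "info": { "title": "Cat picture API", "version": "1.0" },
  "host": "djynet.xyz",
  "schemes": ["https"],
  "paths": {
    "/v2/cat": {
      "get": {
        "operationId": "GetRandomCat",
        "summary": "Return a random cat picture URL",
        "responses": {
          "200": {
            "description": "Random cat picture URL",
            "schema": { "type": "string" }
          }
        }
      }
    }
  }
}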

Flow creation 

Now that the connector is published in your Office 365 space you can create a Flow using it. I copy-pasted an existing template, “Post an update to my company’s Yammer page”, and customized it with an extra step to call my connector and post the connector’s response into the Yammer room.

Then you can test it and see that the result is not exactly what we expected…

I was hoping to have the picture posted on Yammer like it is when I post the URL myself in the UI, like this:

The reason it did not work is that the Yammer connector does not allow posting pictures. There is already an open request on the MS side to support it in the future (feel free to vote for it) here: https://powerusers.microsoft.com/t5/Flow-Ideas/Post-on-Yammer-with-image-attachments/idi-p/14300

Conclusion 

I will wait until MS implements that, like I did for the JS functions in Excel Online 😉 and in the meantime I will try to do a REST call directly to the Yammer API to see if I can post an image directly (without going through Flow).

Let’s encrypt with docker

A few years ago I wrote this post explaining how to get an SSL certificate with Let’s Encrypt: http://djynet.net/?p=795

They now have a docker version which is even easier to use:

docker run -it --rm --name certbot -v "/etc/letsencrypt:/etc/letsencrypt" -p 80:80 -v "/var/lib/letsencrypt:/var/lib/letsencrypt" certbot/certbot certonly --standalone --email charles.walker.37@gmail.com -d djynet.xyz

Interaction with jobs on remote kube cluster

This demo is made on a Windows 10 computer. It shows how to interact with a Kube cluster in Python: start a simple job on it, wait for the job to end, and get its status/logs. The tricky part is getting the logs, since the job object does not directly contain that info; we need to find the pod associated with the job and get the pod logs.

Kube setup (server) 

I use the Kubernetes functionality of Docker for Windows. Start Kubernetes, which is part of Docker for Windows.

Once the Kube cluster is up and running you can interact with it from a terminal (I use PowerShell) that we will call T1 and that will be used for the Kube server-side interaction.

Create service account 

PS C:\Users\charl> kubectl create serviceaccount jobdemo 
serviceaccount "jobdemo" created 

Give full permissions to the SA (not clean, but that's not the goal here)

PS C:\Users\charl> kubectl create clusterrolebinding cluster-admin-binding --clusterrole cluster-admin --serviceaccount default:jobdemo 
clusterrolebinding.rbac.authorization.k8s.io "cluster-admin-binding" created 

Get secret token of the SA (from the secrets) 

PS C:\Users\charl> kubectl get secret jobdemo-token-jk59q -o json 

… 
"token": "ZXlKaGJHY2lPaUpTVXpJMU5pSXNJbXRwWkNJNklpSjkuZXlKcGMzTWlPaUpyZFdK...F6bDlKUUFGSF94Q3BvMVE=" 
… 

It’s base64 encoded… decode it to a string and save it.
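For example in Python (paste the full base64 value from the secret, the one above is truncated):

import base64

token_b64 = "<base64 token value from the secret>"
print(base64.b64decode(token_b64).decode("utf-8"))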

Python script setup (client) 

Start a new PowerShell terminal (let’s call it T2) to work on this part. Build the container from the Dockerfile included in the repo:

PS C:\Code\kubejobs> docker build -t quicktest . 

Start the container and mount the repo inside it (not mandatory, but it allows editing the code from Windows):

PS C:\Code\kubejobs> docker run -it -v C:\Code\kubejobs:/mountfolder quicktest 

Export the token (the decoded version of the base64 token we retrieved previously):

[root@54a8362da7d1 mountfolder]# export KUBE_TOKEN=eyJhbGciOiJSUzI1NiIsImtpZCI...Es5howDOSTWqzl9JQAFH_xCpo1Q 

Start the python script 

[root@54a8362da7d1 mountfolder]# python3.6 kubeJobsDemo.py 
Starting 
Starting job 
Checking job status 
Job is still running. Sleep 1s 
Checking job status 
Job is still running. Sleep 1s 
Checking job status 
Job is still running. Sleep 1s 
Checking job status 
Job is still running. Sleep 1s 
Checking job status 
Job is still running. Sleep 1s 
Checking job status 
Job is still running. Sleep 1s 
Checking job status 
Job is still running. Sleep 1s 
Checking job status 
Job is over 
getting job pods 
Checking job status 
getting job logs 
Job is over without error. Here are the logs:  3.141592653589793 
Cleaning up the job 
Ending 
[root@54a8362da7d1 mountfolder]# 

You can also check the job creation while the Python script is running (but not after, because the job is deleted at the end) from the T1 terminal used before.

PS C:\Users\charl> kubectl get jobs 
NAME      DESIRED   SUCCESSFUL   AGE 
pi        1         0            3s 

As you can see, we also print the logs of the job. I use this Python script daily when I spawn jobs on a remote Kube cluster from a Jenkins server (my Jenkins jobs just spawn Kube jobs on a remote cluster and wait for them to be over). I’m sharing it hoping it can help some people.

The code is quite simple and the only tricky part is to get the pod associated with the job so that we can get the logs (BTW this may not work if the job spawns several pods).

The Job-Pod link is done using a label selector, since that was the recommended method when I wrote the script (https://github.com/kubernetes/kubernetes/issues/24709).
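The gist of the script looks like the sketch below (a minimal version, not the exact code from the repo; the cluster host and the job spec are assumptions matching the Docker for Windows setup and the “pi” job shown above):

from kubernetes import client  # pip install kubernetes
import os
import time

# Authenticate against the cluster with the service-account token from KUBE_TOKEN
configuration = client.Configuration()
configuration.host = "https://kubernetes.docker.internal:6443"  # assumption: Docker for Windows API endpoint
configuration.verify_ssl = False
configuration.api_key = {"authorization": "Bearer " + os.environ["KUBE_TOKEN"]}
batch = client.BatchV1Api(client.ApiClient(configuration))
core = client.CoreV1Api(client.ApiClient(configuration))

# Create a simple job computing pi
container = client.V1Container(
    name="pi", image="perl",
    command=["perl", "-Mbignum=bpi", "-wle", "print bpi(16)"])
template = client.V1PodTemplateSpec(
    spec=client.V1PodSpec(restart_policy="Never", containers=[container]))
job = client.V1Job(
    metadata=client.V1ObjectMeta(name="pi"),
    spec=client.V1JobSpec(template=template, backoff_limit=0))
batch.create_namespaced_job(namespace="default", body=job)

# Wait for the job to finish
while True:
    status = batch.read_namespaced_job(name="pi", namespace="default").status
    if status.succeeded or status.failed:
        break
    time.sleep(1)

# The job object does not carry the logs: find the pod through the job-name label
pods = core.list_namespaced_pod(namespace="default", label_selector="job-name=pi")
logs = core.read_namespaced_pod_log(name=pods.items[0].metadata.name, namespace="default")
print("Job is over. Here are the logs:", logs)

# Clean up the job (and its pod)
batch.delete_namespaced_job(
    name="pi", namespace="default",
    body=client.V1DeleteOptions(propagation_policy="Background"))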

Full code is here: https://bitbucket.org/charly37/kubejobs/src/master/

My favorite way to work with Linux containers on Windows 10

Often I need to write code for Linux servers on a Win10 environment. Before, I was using a VM (VirtualBox/Vagrant), but here is my new way to do it.

Create a Windows folder that will be mounted: C:\Code\dockermount
Start the docker image and mount the folder on /mountfolder:

docker run -it -v C:\Code\dockermount\:/mountfolder ubuntu:18.04

You can now work on Windows with your preferred editor and run the code in the Linux Ubuntu container.

Docker clean up – win 10

PS C:\Code\Devenv> docker system df
TYPE            TOTAL   ACTIVE  SIZE      RECLAIMABLE
Images          10      7       3.236GB   2.386GB (73%)
Containers      30      0       1.256GB   1.256GB (100%)
Local Volumes   0       0       0B        0B
Build Cache     0       0       0B        0B

PS C:\Code\Devenv> docker container prune

PS C:\Code\Devenv> docker volume prune

PS C:\Code\Devenv> docker image prune -a #-a to remove all images

PS C:\Code\Devenv> docker system df
TYPE            TOTAL   ACTIVE  SIZE      RECLAIMABLE
Images          0       0       0B        0B
Containers      0       0       0B        0B
Local Volumes   0       0       0B        0B
Build Cache     0       0       0B        0B