Using a 6522 to control an ATF22V10 to select address lines.
Sure, it will take some clock cycles, but it should work.
CODE:
LDA #$0F
STA VIA_DDRA
LDA #$01
STA VIA_ORA
NOP          ; .. wasted NOPs? How many?
LDA BANK1
(An improvement of a previously posted script.)
Place the bash script below in a subdirectory of your media directory (it scans the parent directory for video files).
Animated GIFs are generated from the video files.
The number of extracted frames depends on the video length; you can tune it via the rate values in the script.
#!/bin/bash
#set -x
mkdir -p tmp prev
: > list
: > index.html
find ../ -type f -print | egrep -i "mp4$|wmv$|avi$|mpg$|mpeg$|flv$|mov$|divx$" > list
cat list | while read -r movie; do
rm -f tmp/*
newname=$( echo "$movie" | tr -d ' /.[]{}()' )
if [ ! -f prev/${newname}.gif ] ; then
echo "Filename : $movie"
msecs=$( mediainfo --Inform="Video;%Duration%" "$movie" ) # duration in milliseconds
minutes=$(( msecs / 60000 ))
echo "Minutes : $minutes"
if [ $minutes -gt 10 ] ; then
rate=0.032
else
rate=0.128
fi
echo "ffmpeg -hide_banner -loglevel error -i $movie -r $rate -vf scale=640:-1 tmp/output_%04d.png"
ffmpeg -hide_banner -loglevel error -i "$movie" -r $rate -vf scale=640:-1 tmp/output_%04d.png < /dev/null
# remove first (most of the time just black or logo)
rm tmp/output_0001.png
echo -n "Frames : "
ls tmp/out* | wc -l
convert -delay 50 -loop 0 tmp/output*.png prev/${newname}.gif
else
echo "$movie exists ... skipping"
fi
echo "<h1>${movie}</h1><br>" >> index.html
echo "<img src=\"prev/${newname}.gif\"><br>" >> index.html
done
exit 0
Run it and you get something like the output below (it was still running when I made this post).
In the past, I’ve played with a standard lidar device.
Now it is time to check out a 360 version.
This one is very small (40 mm x 40 mm x 35 mm).
The provided examples didn’t work. (People on the GitHub issue tracker page reported the same error.)
I changed the Python script so it also works with this YDLidar T-mini Plus.
Next to-do, put this on my robot car.
Code:
import os
import ydlidar
import time
import sys
from matplotlib.patches import Arc
import matplotlib.pyplot as plt
import matplotlib.animation as animation
import numpy as np
RMAX = 32.0
fig = plt.figure()
lidar_polar = plt.subplot(polar=True)
lidar_polar.autoscale_view(True,True,True)
lidar_polar.set_rmax(RMAX)
lidar_polar.grid(True)
ports = ydlidar.lidarPortList();
port = "/dev/ttyUSB0";
for key, value in ports.items():
    port = value;
laser = ydlidar.CYdLidar();
laser.setlidaropt(ydlidar.LidarPropSerialPort, port);
laser.setlidaropt(ydlidar.LidarPropSerialBaudrate, 230400);
laser.setlidaropt(ydlidar.LidarPropLidarType, ydlidar.TYPE_TRIANGLE);
laser.setlidaropt(ydlidar.LidarPropDeviceType, ydlidar.YDLIDAR_TYPE_SERIAL);
laser.setlidaropt(ydlidar.LidarPropScanFrequency, 10.0);
laser.setlidaropt(ydlidar.LidarPropSampleRate, 4);
laser.setlidaropt(ydlidar.LidarPropSingleChannel, False);
laser.setlidaropt(ydlidar.LidarPropMaxAngle, 180.0);
laser.setlidaropt(ydlidar.LidarPropMinAngle, -180.0);
laser.setlidaropt(ydlidar.LidarPropMaxRange, 16.0);
laser.setlidaropt(ydlidar.LidarPropMinRange, 0.02);
laser.setlidaropt(ydlidar.LidarPropIntenstiy, True);
scan = ydlidar.LaserScan()
def animate(num):
    r = laser.doProcessSimple(scan);
    if r:
        angle = []
        ran = []
        intensity = []
        for point in scan.points:
            angle.append(point.angle);
            ran.append(point.range);
            intensity.append(point.intensity);
        lidar_polar.clear()
        lidar_polar.scatter(angle, ran, c=intensity, cmap='hsv', alpha=0.95, marker=".")
ret = laser.initialize();
if ret:
    ret = laser.turnOn();
    if ret:
        ani = animation.FuncAnimation(fig, animate, interval=50)
        plt.show()
    laser.turnOff();
laser.disconnecting();
plt.close();
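For the robot-car to-do, a minimal sketch like the one below could pull the nearest obstacle out of a scan. It assumes the same ydlidar API as the script above (scan.points with angle and range fields); the nearest_obstacle helper and the 0.30 m threshold are just illustrative.

def nearest_obstacle(scan, min_range=0.02):
    # Return (angle_rad, distance_m) of the closest valid point in a LaserScan
    best = None
    for point in scan.points:
        if point.range >= min_range:  # skip invalid/zero readings
            if best is None or point.range < best[1]:
                best = (point.angle, point.range)
    return best

# Usage inside animate(), after laser.doProcessSimple(scan):
# hit = nearest_obstacle(scan)
# if hit and hit[1] < 0.30:
#     print("Obstacle at %.2f m, angle %.2f rad" % (hit[1], hit[0]))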
I 3D printed a little light case for a Wemos and a piece of WS2812 LED strip I had lying around.

Schematic:
NOTE: The resistor is 100-500 ohm (I forgot the exact value, just experiment).
You can only use this trick for a few LEDs (I used 4); otherwise you are better off using the 'sacrifice one LED as a level shifter' trick.
(The Wemos logic is 3.3V and the LED strip is 5V.)
I flashed ESPHome on the Wemos using the flasher in Home Assistant.
Code:
esphome:
  name: matternotification
  friendly_name: matternotification

esp8266:
  board: d1_mini

# Enable logging
logger:

# Enable Home Assistant API
api:
  encryption:
    key: "ogFxZUXerNxxxxxxxxxxxxxxxxxWaWyJVxCM="

ota:
  - platform: esphome
    password: "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password

  # Enable fallback hotspot (captive portal) in case wifi connection fails
  ap:
    ssid: "Matternotification"
    password: "rxxxxxxxxxxxxxxx"

captive_portal:

light:
  - platform: neopixelbus
    type: GRB
    variant: WS2812
    pin: D4
    num_leds: 4
    name: "NeoPixelMattermost"
To get the message status from Mattermost and control the HA entity, I made a bash script.
The first curl command gets a token from Mattermost using the API.
The second curl command gets the unread-message state from Mattermost.
The bottom two curl commands turn a light entity on or off on your Home Assistant server using its REST API.
#!/bin/bash
#set -x
# change : mattermost username and password (and server)
# change : mattermost userid and teamid
# change : home assistant long time token (and HA server)
# change : light entity
#
while true; do
# Get token using login
#token=$(curl -s -i -X POST -H 'Content-Type: application/json' -d '{"login_id":"username","password":"password"}' https://mattermostinstance.com/api/v4/users/login | grep ^Token | awk '{ print $2 }' | tr -d '\r' )
#using a MM auth token (see below)
token=xxxxxxxxxxxxxxxxxxxx
# Get messages
# Gives you something like
# {"team_id":"j3fd7gksxxxxxxxxxxxxxjr","channel_id":"rroxxxxxxxxxxxxxxtueer","msg_count":0,"mention_count":0,"mention_count_root":0,"urgent_mention_count":0,"msg_count_root":0}
# We need to count ":0"
messages=$(curl -s -i -H "Authorization: Bearer ${token}" https://mattermostinstance.com/api/v4/users/ou5nz5--USER-UUID--rbuw4xy/channels/rropejn--TEAM-ID--tueer/unread | grep channel | grep -o ":0" | wc -l)
# If 5 times ":0" then no messages
if [ "$messages" -eq 5 ] ; then
# Turn off
curl -s -X POST -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cC--HOME-ASSISTANT-LONG-TIME-TOKEN-CBusTgTUueWpPNdH5WAWOE" \
-H "Content-Type: application/json" \
-d '{"entity_id": "light.matternotification_neopixelmattermost_2"}' \
http://192.168.1.2:8123/api/services/light/turn_off > /dev/null
else
# Turn on
curl -s -X POST -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cC--HOME-ASSISTANT-LONG-TIME-TOKEn--CBusTgTUueWpPNdH5WAWOE" \
-H "Content-Type: application/json" \
-d '{"entity_id": "light.matternotification_neopixelmattermost_2"}' \
http://192.168.1.2:8123/api/services/light/turn_on > /dev/null
fi
sleep 5
done
Get a Long-lived access token from HA:
Profile > Security and Create Token
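To quickly verify that the long-lived token works, a minimal sketch like this does the same call as the script above using Python's requests (the host, token placeholder and entity_id are taken from that script; adjust them to your setup):

import requests

HA_URL = "http://192.168.1.2:8123"  # same HA host as in the script above
TOKEN = "YOUR-HOME-ASSISTANT-LONG-LIVED-TOKEN"  # paste your token here

response = requests.post(
    f"{HA_URL}/api/services/light/turn_on",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"entity_id": "light.matternotification_neopixelmattermost_2"},
    timeout=5,
)
print(response.status_code)  # 200 means the token and entity_id are accepted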
Create a token in Mattermost:



I got a little SDR stick a while ago with some antennas.

I’ve got some extra antennas.
So I was playing around with P2000 (the Dutch emergency services pager network) and airplane radio (tracking overhead planes).

For debugging I used SigDigger
dump1090-mutability --aggressive --interactive --net --net-http-port 8080 --net-sbs-port 30003

git clone https://github.com/Zanoroy/multimon-ng.git
cd multimon-ng/
mkdir build
cd build
cmake ..
make

rtl_fm -f 169.65M -M fm -s 22050 -p 83 -g 30 | ./multimon-ng -a FLEX -t raw /dev/stdin

dump1090-fa --interactive

I bought a cheap ESP32 display (4 inch, 480×480) from China.
It also has three relays to control lights.
Below is a gallery with default screens.










Flashing openHASP was a breeze.
Configuring it for HA was not so easy.


Install openHASP via HACS on HA.
There is a web interface on the display where you have to configure Wi-Fi, MQTT and the pages.






Config files used for this first test:
{"page":1,"id":34,"obj":"img","src":"L:/pcb480x480.png","auto_size":0,"w":480}
{"page":1,"id":1,"obj":"btn","x":0,"y":0,"w":480,"h":50,"text":"IOTDesigns","value_font":22,"bg_color":"#2C3E50","text_color":"#FFFFFF","radius":0,"border_side":0}
{"page":1,"id":2,"obj":"btn","x":10,"y":60,"w":105,"h":90,"toggle":true,"text":"\uE335","text_font":32,"mode":"break","align":1}
{"page":1,"id":3,"obj":"dropdown","x":10,"y":160,"w":170,"h":60,"options":"Option 1\nOption 2\nOption 3\nOption 4"}
{"page":0,"id":1,"obj":"label","x":375,"y":45,"h":40,"w":100,"text":"00.0°C","align":2,"bg_color":"#2C3E50","text_color":"#FFFFFF"}
{"page":1,"id":6,"obj":"slider","x":20,"y":300,"w":440,"h":40,"min":15,"max":85}
Designer at : https://haspdesigner.qrisonline.nl/
4inchdisplay2:
  objects:
    - obj: "p0b1"  # temperature label on all pages
      properties:
        "text": '{{ states("sensor.livingtemperature") }}°C'
    - obj: "p1b2"  # light-switch toggle button
      properties:
        "val": '{{ 1 if states("switch.livingshelly") == "on" else 0 }}'
        "text": '{{ "\uE6E8" if is_state("switch.livingshelly", "on") else "\uE335" | e }}'
      event:
        "up":
          - service: homeassistant.toggle
            entity_id: "switch.livingshelly"
    - obj: "p1b3"  # dropdown
      event:
        "changed":
          - service: persistent_notification.create
            data:
              message: I like {{ text }}
    - obj: "p1b6"  # Light brightness
      properties:
        "val": "{{ state_attr('number.dinnertable_brightness_0', 'brightness') if state_attr('number.dinnertable_brightness_0', 'brightness') != None else 0 }}"
      event:
        "changed":
          - service: light.turn_on
            data:
              entity_id: number.dinnertable_brightness_0
              brightness: "{{ val }}"
        "up":
          - service: light.turn_on
            data:
              entity_id: number.dinnertable_brightness_0
              brightness: "{{ val }}"
NOTE: The dimmer is not working via HA (yet), but MQTT messages are working (see the sketch below).
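Since MQTT does work, the brightness slider can be driven directly over MQTT. A minimal sketch, assuming the openHASP command topic layout hasp/<nodename>/command/<object>.<property>, the node name 4inchdisplay2 from the config above, and a broker address you will need to adjust:

import paho.mqtt.client as mqtt

BROKER = "192.168.1.2"  # assumption: your MQTT broker address
NODE = "4inchdisplay2"  # node name used in the openHASP config above

client = mqtt.Client()
client.connect(BROKER, 1883, 60)
# Set the brightness slider (p1b6) to 50; openHASP should report the change back on its state topic
client.publish(f"hasp/{NODE}/command/p1b6.val", "50")
client.disconnect()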

| obj | Type | Description | Extra Parts |
|---|---|---|---|
| btn | Binary | Button | |
| switch | Toggle | Switch | indicator, knob |
| checkbox | Toggle | Checkbox | indicator |
| label | Visual | Label | |
| led | Visual | LED | |
| spinner | Visual | Spinner | indicator |
| obj | Visual | Base Object | |
| line | Visual | Line | |
| img | Visual | Image | |
| cpicker | Selector | Color picker | knob |
| roller | Selector | Roller | selected |
| dropdown | Selector | Dropdown List | selected, items, scrollbar |
| btnmatrix | Selector | Button Matrix | items |
| msgbox | Selector | Messagebox | items, items_bg |
| tabview | Selector | Tabview | items, items_bg, indicator, selected |
| tab | Selector | Tab | |
| bar | Range | Progress Bar | indicator |
| slider | Range | Slider | indicator, knob |
| arc | Range | Arc | indicator, knob |
| linemeter | Range | Line Meter | |
| gauge | Range | Gauge | indicator, ticks |
| qrcode | Visual | QR code | |
We’ve gone too far.
AI is generating news articles and complete YT videos.
Also, forums and news articles are made using AI.
Reviews are being generated by vote farms.
Unchecked, and being re-ingested by other AI scrapers.
It’s being fed again into other new AI generators.
Content generators are not interested in whether the generated content is true.
Just generate traffic and income.
I hate watching a long YT video with a generated voice and story content from ChatGPT that is not fact-checked.
No new information, just generic information stretched to fill more watch time.
If there is a disaster, people generate false footage just to generate traffic.
I’ll resume this rant in time .. I’m not done.
I’ve got a Flipper Zero at last.
https://flipperzero.one/
I know, it’s more a useful toy than a serious tool.
It’s too limited. But useful for me.
Learning about tools and sub gigahertz monitoring.
I hope to get a BFFB for it; that would be a big plus.
https://www.justcallmekokollc.com/product/flipper-zero-bffb/31
One of the first things was reflashing the device with Momentum firmware.
I’ve ordered a Wi-Fi Dev Board, so I can use Marauder.
Here are some qFlipper screenshots.





I will add pictures and info about the Wi-Fi dev board later.
Some information:
The Flipper Zero is a versatile multi-tool for geeks, hackers, and hardware enthusiasts. It is designed as a portable, open-source device with numerous capabilities for interacting with digital systems and hardware: sub-GHz radio, 125 kHz RFID, NFC, infrared, iButton, GPIO and BadUSB.
The Flipper Zero is a powerful tool, but its legality depends on how it is used. Be sure to respect laws and ethical guidelines when exploring its capabilities.
The goal: display quotes, changing once per hour.
There is not much to be found for the Waveshare 4.2 inch e-paper.
Except for an Arduino web example.
( see https://www.waveshare.com/wiki/E-Paper_ESP32_Driver_Board )
I reverse engineered how it works and created a Python upload script to push images.
The original protocol is a mess:
4 bits of color per nibble, with the two nibbles of each byte swapped;
black and red sent as separate planes;
and an 'a' to 'p' letter encoding pushed over curl-style HTTP requests.
My implementation is a Python script, called as:
python3 epaper-pusher.py ~/Downloads/Untitled.png


http://10.1.0.99/EPDI_
30 times something like (about 1000 encoded characters per request, shortened here):
http://10.1.0.99/pppp...aaaa...bbbb...pppp...iodaLOAD_
http://10.1.0.99/NEXT_
30 times something like:
http://10.1.0.99/pbcdefghij...pppp...iodaLOAD_
http://10.1.0.99/SHOW_
NOTES:
a = 0000 ... p = 1111 (= 15)
30 lines of 1000 bytes each, each ending with iodaLOAD_
black pixels : first block 1, second block 0
red pixels   : first block 0, second block 1
white pixels : first block 1, second block 1
PIXEL example: RBRB BWBW
First block : 1010 = letter k, 0101 = letter f (second nibble = white)
Second block: 0101 = letter f, 1111 = letter p (second nibble = white)
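As a quick worked example of that letter encoding (using the same scheme as the script below, and assuming the red plane is the first block, as the script sends it): the red-plane bits for RBRB BWBW are 1010 and 0101, which map to the letters 'k' and 'f', and the pair swap then reverses their order before upload.

# Worked example of the nibble -> letter ('a'..'p') encoding for the RBRB BWBW pixels above.
def nibble_to_letter(bits):
    # bits: four 0/1 values, most significant first
    value = bits[0] * 8 + bits[1] * 4 + bits[2] * 2 + bits[3]
    return chr(ord('a') + value)

red_plane = [1, 0, 1, 0, 0, 1, 0, 1]  # R B R B  B W B W (red and white pixels set the bit)
letters = [nibble_to_letter(red_plane[i:i + 4]) for i in range(0, len(red_plane), 4)]
print(letters)  # ['k', 'f']
# The script below then swaps letters pair-wise before uploading: ['f', 'k']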
Code
from PIL import Image
import numpy
import requests
url="http://10.1.0.99/"
black_pixels = numpy.zeros((400,300))
red_pixels = numpy.zeros((400,300))
def classify_pixel_color(pixel):
    """
    Classify a pixel as black, white, or red.
    """
    r, g, b = pixel[:3]  # Ignore alpha if present
    # Define thresholds for classification
    if r < 128 and g < 128 and b < 128:
        return 'black'
    elif r > 200 and g > 200 and b > 200:
        return 'white'
    elif r > 128 and g < 100 and b < 100:
        return 'red'
    else:
        return None
def process_image(image_path):
    """
    Process the image and classify its pixels into black, white, or red.
    """
    image = Image.open(image_path)
    image = image.convert("RGB")  # Ensure the image is in RGB mode
    width, height = image.size
    pixel_data = image.load()
    color_counts = {'black': 0, 'white': 0, 'red': 0}
    for y in range(300):
        for x in range(400):
            black_pixels[x][y] = 0
            red_pixels[x][y] = 0
    for y in range(300):
        for x in range(400):
            color = classify_pixel_color(pixel_data[x, y])
            if color:
                color_counts[color] += 1
                if color == 'black':
                    black_pixels[x][y] = 1
                if color == 'red':
                    red_pixels[x][y] = 1
                if color == 'white':
                    black_pixels[x][y] = 1
                    red_pixels[x][y] = 1
    return color_counts, black_pixels, red_pixels
def number_to_letter(num):
    """
    Translates a number from 0 to 15 into a corresponding letter (a-p).
    Args:
        num (int): The number to translate.
    Returns:
        str: The corresponding letter (a-p).
    """
    if 0 <= num <= 15:
        return chr(ord('a') + num)
    else:
        raise ValueError("Number must be between 0 and 15, inclusive.")
def print_array_in_chunks(array, chunk_size=1001):
    current_chunk = ""
    for item in array:
        # Convert item to string and add to the current chunk
        item_str = str(item)
        if len(current_chunk) + len(item_str) + 1 > chunk_size:
            # Send the current chunk and reset it
            current_chunk += "iodaLOAD_"
            try:
                response = requests.get(url + current_chunk, verify=False)
                if not response.content:  # Equivalent to expecting an empty reply
                    pass
            except requests.exceptions.RequestException as e:
                # Catch any request-related errors
                pass
            current_chunk = item_str
        else:
            # Append the item to the current chunk
            current_chunk += item_str
    current_chunk += "iodaLOAD_"
    # Send any remaining items in the chunk
    if current_chunk:
        try:
            response = requests.get(url + current_chunk, verify=False)
            if not response.content:  # Equivalent to expecting an empty reply
                pass
        except requests.exceptions.RequestException as e:
            # Catch any request-related errors
            pass
def switch_in_pairs(arr):
    # Loop through the array with a step of 2
    for i in range(0, len(arr) - 1, 2):
        # Swap values at index i and i+1
        arr[i], arr[i + 1] = arr[i + 1], arr[i]
    return arr
if __name__ == "__main__":
    import sys
    if len(sys.argv) < 2:
        print("Usage: python3 script.py <image_path>")
        sys.exit(1)
    image_path = sys.argv[1]
    try:
        color_counts, black_pixels, red_pixels = process_image(image_path)
        try:
            response = requests.get(url + "EPDI_", verify=False)
            if not response.content:  # Equivalent to expecting an empty reply
                pass
        except requests.exceptions.RequestException as e:
            # Catch any request-related errors
            pass
        lines = []
        for y in range(300):
            for x in range(0, 399, 4):
                first = red_pixels[x][y]
                second = red_pixels[x + 1][y]
                third = red_pixels[x + 2][y]
                fourth = red_pixels[x + 3][y]
                nibble = 0
                if first == 1:
                    nibble = nibble + 8
                if second == 1:
                    nibble = nibble + 4
                if third == 1:
                    nibble = nibble + 2
                if fourth == 1:
                    nibble = nibble + 1
                lines.append(number_to_letter(nibble))
        switched_array = switch_in_pairs(lines)
        print_array_in_chunks(switched_array)
        try:
            response = requests.get(url + "NEXT_", verify=False)
            if not response.content:  # Equivalent to expecting an empty reply
                pass
        except requests.exceptions.RequestException as e:
            # Catch any request-related errors
            pass
        lines = []
        for y in range(300):
            for x in range(0, 399, 4):
                first = black_pixels[x][y]
                second = black_pixels[x + 1][y]
                third = black_pixels[x + 2][y]
                fourth = black_pixels[x + 3][y]
                nibble = 0
                if first == 1:
                    nibble = nibble + 8
                if second == 1:
                    nibble = nibble + 4
                if third == 1:
                    nibble = nibble + 2
                if fourth == 1:
                    nibble = nibble + 1
                lines.append(number_to_letter(nibble))
        switched_array = switch_in_pairs(lines)
        print_array_in_chunks(switched_array)
        try:
            response = requests.get(url + "SHOW_", verify=False)
            if not response.content:  # Equivalent to expecting an empty reply
                pass
        except requests.exceptions.RequestException as e:
            # Catch any request-related errors
            pass
    except Exception as e:
        print(f"Error: {e}")
Yesterday I got my Home Assistant Voice!
This is a non-cloud alternative to devices like Alexa and Google Home.
I only could play with it for a few minutes because I was working on Arduino code with an ILI9341 Display and a BME280 (Temperature/Humidity/Air pressure).
Today I got some new goodies in; one of them is a LilyGO LoRa display that works on 433 MHz.
I flashed OpenMQTTGateway on this device.
In the past, I posted about the RFXCOM gateway using Domoticz.
This runs on a Raspberry Pi.
While looking for alternatives, I found an rtl-sdr solution.
https://github.com/merbanan/rtl_433
Using this:
But I liked the ESP32 solution more.
Now I can dismantle Domoticz, which served me well for many years.
Note: This is a receiver device only!
But I only use read-only sensors like: door/window, doorbell, temperature/humidity and fire sensors.
These are automatically detected in Home Assistant.
No more RFXCOM with a Raspberry.
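To see what the gateway actually receives, a small sketch like this can watch the MQTT traffic. The broker address and the topic are assumptions (OpenMQTTGateway publishes decoded 433 MHz frames under its base topic, by default something like home/<gateway-name>/433toMQTT); adjust both to your setup:

import json
import paho.mqtt.client as mqtt

BROKER = "192.168.1.2"  # assumption: your MQTT broker
TOPIC = "home/+/433toMQTT"  # assumption: OpenMQTTGateway default base topic layout

def on_message(client, userdata, msg):
    try:
        print(msg.topic, json.loads(msg.payload))  # decoded sensor frames arrive as JSON
    except ValueError:
        print(msg.topic, msg.payload)

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883, 60)
client.subscribe(TOPIC)
client.loop_forever()  # print every 433 MHz message the gateway decodes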