MacOS High Sierra Media Key Enabler for iTunes and Spotify. You can prioritize which app you would like to control, or go with the default behaviour, which controls the currently running app.
The app runs in the menu bar in the form of a subtle and beautiful black dot.
Apple just released High Sierra, and it brought both good things and annoying things: they changed the behaviour of the media keys, which no longer control iTunes but instead control video playback in Safari. This pissed off a lot of people, including me, so I created a menu bar app that proxies media key events to iTunes/Spotify while Apple fixes this. It doesn't support the Touch Bar yet, only physical keys.
Back in 2007, when I was still young and greedy, I reverse-engineered the communication between the Flash player and the very expensive Flash Media Server and created my own RTMP server. I wanted to show off like never before, so I decided that the binary should stay under 64 kilobytes, because we came from the eighties :) And I did it; it was never flawless or completely bug-free, but it worked well. After other open-source and closed-source RTMP servers appeared and new features were added to RTMP, I couldn't keep up, so I dropped the project. Check it out.
A lot of friends have recently asked me to teach their children a little programming. After a few sessions I decided to start a YouTube video series about programming - for kids. I proudly present the first two episodes.
Do you have a super silent MacBook and sometimes have no idea whether it is working or not? Are you just curious what tasks MacOS does in the background? Or are you a developer who needs to see what your application is doing in the background?
MacOS Activity Indicator puts an icon in your menu bar whose LED switches between green and red when file activity happens. If you click on the icon, a window appears showing the last 100 files touched by the operating system.
In addition to the OAPKA update I have released a new Mac app called MacOS Activity Indicator.
It started with fswatch, a super handy command line utility installable with brew ( brew install fswatch ). You simply type "fswatch /" into the terminal and no file activity remains hidden on your computer: you can see what nasty things MacOS and every other app are doing in the background. I use it to automagically generate header files for my C files no matter where they are.
Then I realized that I could create a menu bar activity indicator for slow and silent machines - like the new MacBook - because sometimes they seem unresponsive and you don't know what is happening. As a plus, I've added an observation window - you can open/close it by clicking on the menu bar icon - that shows the last 100 changed files.
I decided not to add filtering or an external app option - it's just a comfy indicator/quick-check app. If you want to do serious work, use fswatch instead.
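Under the hood, this kind of monitoring is what the FSEvents C API provides on MacOS. I can't say this is exactly how the app does it, but a minimal command line sketch of watching the whole filesystem looks roughly like this ( the printing callback is where an indicator would blink its LED ):

```c
/* build: clang fsindicator.c -framework CoreServices -o fsindicator */
#include <CoreServices/CoreServices.h>
#include <stdio.h>

/* called by FSEvents with a batch of changed paths */
static void activity_callback(ConstFSEventStreamRef stream, void* info, size_t count,
                              void* paths, const FSEventStreamEventFlags flags[],
                              const FSEventStreamEventId ids[])
{
    char** cpaths = (char**)paths;
    for (size_t i = 0; i < count; i++)
        printf("%s\n", cpaths[i]); /* a changed path - an indicator app would flash here */
}

int main(void)
{
    CFStringRef path  = CFSTR("/"); /* watch the whole filesystem, like "fswatch /" */
    CFArrayRef  paths = CFArrayCreate(NULL, (const void**)&path, 1, &kCFTypeArrayCallBacks);

    FSEventStreamRef stream = FSEventStreamCreate(NULL, activity_callback, NULL, paths,
                                                  kFSEventStreamEventIdSinceNow,
                                                  0.3, /* latency in seconds */
                                                  kFSEventStreamCreateFlagNone);

    FSEventStreamScheduleWithRunLoop(stream, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode);
    FSEventStreamStart(stream);
    CFRunLoopRun();
    return 0;
}
```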
The app is open source and in the public domain, as always:
Do you use an optical audio cable or HDMI to connect your Mac to your amplifier? Do you experience one-second silences in high-fidelity music, or lost words and sounds in movies? Well, it's neither your fault nor your hardware's - it's an energy-saving feature of these audio interfaces. And Audio Keepalive is the solution for you!!!
The app generates a completely inaudible tone that prevents your audio output from shutting down during short silences/musical breaks, so you can enjoy an uninterrupted hi-fi audio stream.
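I won't pretend this is the app's actual code, but here is a minimal sketch of the idea in C: keep filling the output with a sine wave of such low amplitude that you can't hear it, so the audio link never sees a long stretch of pure silence. The frequency, the amplitude and the callback-style interface are my assumptions; the real app would feed something like this into the system audio output ( e.g. through a Core Audio render callback ).

```c
#include <math.h>
#include <stddef.h>

/* Fill `frames` samples of a mono float buffer with a barely-there sine wave.
   `phase` is carried across calls so the tone stays continuous. */
void fill_keepalive(float* buffer, size_t frames, double samplerate, double* phase)
{
    const double freq = 20.0;     /* assumption: a very low frequency tone            */
    const double amp  = 0.00003;  /* assumption: far below the threshold of hearing   */

    for (size_t i = 0; i < frames; i++)
    {
        buffer[i] = (float)(amp * sin(*phase));
        *phase += 2.0 * M_PI * freq / samplerate;
        if (*phase > 2.0 * M_PI) *phase -= 2.0 * M_PI;
    }
}
```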
Audio Keepalive is a menu bar application. Just click on the menu bar icon and select "Quit" to quit.
In the past year I've created a few very nice wrappers for different platforms and operating systems that wrap the same controller file to simplify multi-platform C development for my stuff. For your edification and pleasure they are now yours on GitHub!
TMMCP is a wrapper/project collection for C programmers who like it quick & simple & with total control. A TMMCP wrapper provides:
an OpenGL context
device input events
native audio/video playback
native video to texture rendering
some other functions I found useful during game development
The wrappers are very simple; after a little reading you can easily extend them if you need some special functionality.
Current list of wrappers and projects
iOS - Xcode
MacOS - Xcode
Android - Android Studio
asmjs/html5 ( emscripten ) - bash script
What is the license?
Everything in this repo is in the public domain. Take it, use it, learn from it.
How to use it
Open "template/sources/controller.c" in your favorite editor and start coding. After you finished open the wanted platform's project file and build/compile/run/deploy. You may have to add newly created source files / include paths to the project settings.
Cool, do you have any documentation?
The documentation is in-line in controller.c. If you don't get it, check out the demo projects. demo_dragbox shows a draggable white box over a purple background - it demonstrates basic OpenGL usage, input handling and audio playback. demo_conference is an advanced demo: it creates a 3D conference room with video avatars using the wrapper's video-to-texture rendering functionality - even in html5! But not on Android yet; I'm still in the middle of the implementation, so it will show blank avatars there.
Windows and Linux wrappers/projects would be awesome for a start, but any platform is warmly welcomed!
In the demos I'm using some stuff from these beautiful people:
I'm working on a multi-platform C and OpenGL based UI renderer, and displaying things at 60 fps is essential.
It works well on desktop and mobile OSes and it looks good in WebGL on my non-retina MacBook Air, but sadly on retina MacBook Pros the framerate dies with big browser windows ( more than ~50% of the screen ).
After a few days of trial and research I figured out the following:
Google Chrome's WebGL implementation on OS X is much faster than Safari's
enabling/disabling the preserved drawing buffer makes no difference
using a scissor test to draw only a fragment of the screen makes no difference
switching off antialiasing and the context's alpha channel makes a difference ( see the sketch after this list )
disabling alpha blending speeds up frame rendering big time
in OS X's high and maximum display scaling modes the maximum framerate you can achieve with a full-size browser window and a full-size canvas is 35-40 fps; in lower modes 60 fps is reachable
a full texture upload at any time kills performance
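For the asmjs/html5 target the antialias and alpha settings are part of the WebGL context attributes. A minimal sketch using emscripten's html5.h API ( assuming a standard emscripten setup and the default "#canvas" target; older emscripten versions address the canvas differently ):

```c
#include <emscripten/html5.h>

EMSCRIPTEN_WEBGL_CONTEXT_HANDLE create_opaque_context(void)
{
    EmscriptenWebGLContextAttributes attrs;
    emscripten_webgl_init_context_attributes(&attrs);

    attrs.alpha     = EM_FALSE; /* opaque canvas: the browser doesn't have to blend it into the page */
    attrs.antialias = EM_FALSE; /* skip multisampling */

    EMSCRIPTEN_WEBGL_CONTEXT_HANDLE ctx = emscripten_webgl_create_context("#canvas", &attrs);
    emscripten_webgl_make_context_current(ctx);
    return ctx;
}
```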
So what did I learn?
WebGL will never be as fast as standard (windowed) OpenGL because the browser has to blend the WebGL canvas into the web page on every frame, and this slows down rendering big time
WebGL is smart enough not to swap frame buffers if the GL context hasn't changed
for a full-scale retina OS X WebGL UI renderer experience I'll have to wait for the next generation of MacBooks
Anyway, I rewrote my UI renderer to use as few GL calls as possible, to prefer glTexSubImage2D and glBufferSubData calls over full uploads, and to use bitmaps for text fields instead of individual textured quads for letters - and now I'm almost satisfied.
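As a sketch of the partial-update idea ( the helper functions are illustrative, not my renderer's actual code ):

```c
#include <GLES2/gl2.h> /* or the GL header of your platform */

/* Update only the dirty rectangle of an already allocated texture instead of
   re-uploading the whole bitmap every frame. `pixels` is assumed to hold just
   the dirty region, tightly packed, in RGBA order. */
void update_dirty_rect(GLuint texture, GLint x, GLint y, GLsizei w, GLsizei h,
                       const unsigned char* pixels)
{
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexSubImage2D(GL_TEXTURE_2D, 0, x, y, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

/* Same idea for vertex data: overwrite only the changed slice of the VBO. */
void update_vertex_range(GLuint vbo, GLintptr offset, GLsizeiptr size, const void* data)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferSubData(GL_ARRAY_BUFFER, offset, size, data);
}
```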
So you just take previously written C/OpenGL games/prototypes and you compile them for the BROWSER!!! It's madness!
I don't have the nerve for web programming, for tons of DOM elements and CSS, for different JS VM implementations, for debugging hell and everything else the modern web "gave" me as a developer. And now I don't have to deal with all these things, and I can still deploy for the browser!!! ( Okay, I have to deal with them a little because it still is web development, but hey, now I only need hours to fix something strange, not days!!! )
Last week I badly wanted to see my face as a voxel cloud on the screen, so I entered 3D scanning territory. There are two affordable ways for a middle-class person to do it: reconstruction from a photo series or triangulation based on a laser line. The first method is quite inaccurate and needs complex algorithms ( Autodesk 123D is the biggest free solution ); the second one is dead simple and accurate, but it is very hard to add texture data. So I chose the latter. It needs only two things: a line laser and my mobile's camera.
So I went to the local home improvement store and bought a Bosch Quigo laser level, set up a simple scene and started coding.
The theory: you know the distance between the laser emitter and the camera lens's focal point, you know the angle between the laser and the camera axis, and you also know the field of view of the camera. From these data you can link an angle to every pixel of an image created by the camera, and from that angle you can tell the distance of the point from the camera.
The red line is the laser, the blue line is the camera axis. d1 is the distance between them; it is set by you. a1 is the angle of the camera axis; it is also set by you. The black line is the line between the camera and the point where the laser touches an object. The yellow line is the projection plane of the camera, and the green lines mark the field of view of the camera. The field of view is also known: you can look it up from your phone's vendor, calculate it from the focal length and sensor size ( for my iPhone 6s they are 4.15 mm and 4.5 mm, so the field of view is atan( ( 4.5 / 2.0 ) / 4.15 ) * 2, roughly 56 degrees when you hold it horizontally ), or put something one meter wide on the ground, take a picture of it from 2 meters away and do the math based on the image.
So the point is that the black line will have a corresponding pixel on the image. You know angle a2 ( which is the field of view divided by two ), so you can calculate angle a3, since angles correlate linearly with pixel distance: ( image width / 2 ) / ( FOV / 2 ) = ( pixel distance of the laser dot from the center ) / ( wanted angle ). And once you have angle a3 you know the angle between the red line and the black line, and a cosine will give you the length of the black line: cos( angle ) = d1 / wanted length, so the length is d1 / cos( angle ).
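Putting the proportions above into code, a sketch of the pixel-to-distance mapping could look like this. How a3 combines with a1 depends on the exact rig geometry and on which side of the axis the laser dot falls, so treat that step as an assumption, not a drop-in scanner:

```c
#include <math.h>

/* All angles in radians. Returns the length of the "black line"
   ( camera to laser touch point ) using the relations described above. */
double point_distance(
    double d1,          /* distance between laser emitter and camera lens          */
    double fov,         /* horizontal field of view of the camera                  */
    double image_width, /* image width in pixels                                   */
    double pixel_x,     /* pixel distance of the laser dot from the image center   */
    double a1)          /* angle of the camera axis, as set up in the rig          */
{
    /* pixels map linearly to angles: (image_width/2) / (fov/2) = pixel_x / a3 */
    double a3 = pixel_x * (fov / 2.0) / (image_width / 2.0);

    /* angle between the red (laser) line and the black line -
       assumption: a1 minus a3, flip the sign for the other side of the axis */
    double angle = a1 - a3;

    /* cos(angle) = d1 / length  =>  length = d1 / cos(angle) */
    return d1 / cos(angle);
}
```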