I started off the day with an Indiegogo update and a Facebook post with the photos from the launch. It was already late in the morning, so as soon as I finished it was time to go into town to buy the G15.
The camera in question is a Canon PowerShot G15, which was my first choice when I was looking for cameras to buy. Unfortunately, all the listings I found were way outside my price range, sitting at around R3000 or more. Then, while posting my ad for the SX430IS on Gumtree, I saw a G15 going for R1500, which is practically unheard of. I set up a meeting with the seller in town, so that is where I went first.
After a bit of lateness and running around trying to find each other, we met up at Bakoven, where we sat down and I got to see the camera for the first time. It was exactly as advertised, and I couldn’t find any defects or anything wrong with the software. It came with a 32GB card, which was a very nice bonus. R1500 later, the camera was in my happy hands and heading back home. My mom needed to stop off in Woodstock to look at some furniture, so I used the time to start writing up my account of Thursday’s launch.
When we got home, I found my dad working on a Mission Report Dashboard he had been creating. It takes the data from the previous two launches and works out all the information one could want about a flight, ranging from the maximum speed to the flight time to the descent rate, as well as an elevation profile and a plot of the predicted route versus the actual route. All of this is generated automatically from the data received during the flight.
This is a very useful tool, as it will enable me to analyze each flight in order to improve future ones, and it will make for some nice content to share with those who are interested.
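Out of curiosity, I thought about what the dashboard actually has to compute. The headline numbers boil down to a few calculations over the logged GPS points. Here is a rough Python sketch of the idea; the log format and field names are my own guesses, not what my dad actually used:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flight_summary(rows):
    """rows: list of dicts with 't' (seconds), 'lat', 'lon' and 'alt' (metres)."""
    max_speed = 0.0
    max_descent = 0.0
    for prev, cur in zip(rows, rows[1:]):
        dt = cur["t"] - prev["t"]
        if dt <= 0:
            continue
        ground = haversine_m(prev["lat"], prev["lon"], cur["lat"], cur["lon"])
        climb = cur["alt"] - prev["alt"]
        max_speed = max(max_speed, math.hypot(ground, climb) / dt)  # 3D speed, m/s
        if climb < 0:
            max_descent = max(max_descent, -climb / dt)  # falling rate, m/s
    return {
        "flight_time_s": rows[-1]["t"] - rows[0]["t"],
        "max_altitude_m": max(r["alt"] for r in rows),
        "max_speed_ms": max_speed,
        "max_descent_ms": max_descent,
    }
```

The elevation profile and route comparison are then just plots of altitude against time and of the predicted versus logged coordinates.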
The dashboard will be linked to the tracker, and once the tracker detects that the payload has landed, it will save the data and automatically generate the report for that launch. In addition, the tracker is getting an upgrade: it will automatically detect when the balloon has been launched and when it has landed, and it will produce a fresh prediction of the landing spot every minute, which should make the recovery process a lot easier. The software will also detect when the balloon has burst and mark that point with a burst icon.
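I don’t know exactly how the detection will be implemented, but the idea is simple enough: watch the vertical rate and switch between a few states. Something along these lines, with thresholds that are pure guesses on my part:

```python
def classify(prev_alt_m, alt_m, dt_s, state):
    """Tiny state machine: ground -> ascending -> descending -> landed.
    The thresholds are illustrative guesses, not tuned values."""
    rate = (alt_m - prev_alt_m) / dt_s  # vertical rate in m/s
    if state == "ground" and rate > 1.0:
        return "ascending"   # launch detected
    if state == "ascending" and rate < -2.0:
        return "descending"  # burst detected
    if state == "descending" and abs(rate) < 0.2 and alt_m < 2000:
        return "landed"      # landing detected
    return state
```

A real version would want to smooth the altitude over a few samples first, since raw GPS altitude can be fairly noisy.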
My dad and I were thinking of potentially turning all of this into an app, which would be able to access the phone’s hardware: the accelerometer, compass, GPS and camera. With these we could build an AR tracker, which would display the phone’s camera feed while the GPS works out where the phone is in relation to the balloon. The accelerometer and compass would then direct the user to where the balloon is in the sky, using trigonometric calculations.
Of course, at this point this is just an idea, and it will probably not turn into anything in the time frame we have.
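Still, the trigonometry behind it is straightforward. Given the GPS positions of the phone and the balloon, you can work out the compass bearing and elevation angle the phone should point at. A minimal sketch, using a flat-earth approximation that is fine over the distances involved (the function name and structure are just my illustration):

```python
import math

def look_angles(phone_lat, phone_lon, phone_alt_m, ball_lat, ball_lon, ball_alt_m):
    """Rough azimuth and elevation from the phone to the balloon."""
    # metres per degree of latitude/longitude near the phone's position
    m_per_deg_lat = 111320.0
    m_per_deg_lon = 111320.0 * math.cos(math.radians(phone_lat))
    north = (ball_lat - phone_lat) * m_per_deg_lat
    east = (ball_lon - phone_lon) * m_per_deg_lon
    up = ball_alt_m - phone_alt_m
    azimuth = math.degrees(math.atan2(east, north)) % 360  # compass bearing
    elevation = math.degrees(math.atan2(up, math.hypot(north, east)))
    return azimuth, elevation
```

The app would then compare the azimuth with the compass heading and the elevation with the tilt from the accelerometer to decide where to draw the balloon marker over the camera preview.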
We also spoke about the design of the MKIII housing. It is apparent that a conical hole needs to be cut for the camera instead of a cylindrical one, to give a greater field of view. To achieve this, we can either cut the hole directly into the polystyrene, or make a mould and fill it with polyurethane, which is apparently stronger and lighter than polystyrene due to its elasticity and sponginess. This could make an excellent MKIII, but it will take extra time and effort to learn how to do and then to build. I don’t think I will be able to get it done before the next launch, but possibly before launch 4.
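To get a feel for the sizes involved, the cone only needs to open wide enough that the camera’s field of view clears the wall of the housing. A quick calculation with made-up numbers, since I haven’t measured anything yet:

```python
import math

def outer_hole_diameter(lens_diameter_mm, wall_thickness_mm, fov_deg):
    """Diameter the hole must reach at the outer face of the wall so that
    a lens of the given diameter sees its full field of view (fov_deg)."""
    half_angle = math.radians(fov_deg / 2)
    return lens_diameter_mm + 2 * wall_thickness_mm * math.tan(half_angle)

# e.g. a 50 mm lens opening behind 30 mm of foam with roughly 75 degrees of view
print(round(outer_hole_diameter(50, 30, 75), 1), "mm")  # ~96 mm
```

With those guessed numbers the outer opening would need to be almost twice the diameter of the inner one, which is exactly why a straight cylindrical hole ends up cutting into the shot.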
I will monitor the Facebook posts and see how they go; hopefully I can get a bit more funding.




