End of Week 7 01/06/25
Another choppy week and an even later blog. Halfway through, my family went on a ski trip, so I missed a couple of days. However, a couple of the sessions were extended and made up for lost time. This week was complicated, and I felt overwhelmed at times. I am doing my best to stay afloat. My dad has decided to gradually scale back the support he gives me: less help and less hand-holding from now on. He assures me I am learning a lot, but with the number of mistakes I made and the amount of confusion I felt, it didn't always feel that way. As always, I will try to describe what went down this week. Bear with me.
As I mentioned in the video, we purchased a wireless USB dongle for the robot. This allows the bot to connect to the computer wirelessly and send data back and forth. The first dongle didn't work; its device driver was not available in the stripped-down version of Linux the Mindstorms computer has to run. So, we got a second one. That one worked, after some setup. Once the bot could ping the server on the computer, we added on to the mapper code demonstrated in the video. We used a library called urequests, a lightweight HTTP library for MicroPython that lets the robot send and receive mission data. With that in place, the robot could send telemetry to the server: info like rotation angle, distance traveled, and distance from the wall. In the future, the robot could be controlled by the computer via the server. I used my knowledge of trig to figure out what the coordinates would be at the end of the missions relative to the origin (where the robot starts). It was a functional, local way to simulate a rudimentary mapping program. At this point we had set up multiple separate parts, but hadn't put them together yet.
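To give a rough idea of what that looks like, here is a small sketch of the robot-side pieces: posting one telemetry reading with urequests, and using trig to work out where the robot ends up relative to the origin. The server address and field names are placeholders I made up for illustration, not our exact code.

```python
# A minimal sketch, assuming a Flask server listening at a made-up address.
import math
import urequests

SERVER = "http://192.168.0.10:5000/missions"  # hypothetical address of the computer's server

def post_telemetry(mission, angle_deg, distance_cm, wall_cm):
    # Package one telemetry reading and send it to the server as JSON.
    payload = {
        "Mission": mission,
        "rotation_angle": angle_deg,
        "distance_moved": distance_cm,
        "distance_from_wall": wall_cm,
    }
    response = urequests.post(SERVER, json=payload)
    response.close()  # free the socket; MicroPython has very few to spare

def end_coordinates(moves):
    # Each move is (rotation in degrees, distance in cm). Starting at the
    # origin facing "up", accumulate the heading and use sin/cos to get the
    # final (x, y) relative to where the robot started.
    x, y, heading = 0.0, 0.0, 0.0
    for angle_deg, distance_cm in moves:
        heading += math.radians(angle_deg)
        x += distance_cm * math.sin(heading)
        y += distance_cm * math.cos(heading)
    return x, y
```

The trig part is really just that last function: each turn changes the heading, each drive adds a sine/cosine step along that heading, and the running totals are the coordinates.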
Next, we started to use the Flask server I set up last week. It was during this segment that I really started to have to figure out certain things on my own. The data from the posted missions appears on the site, and the missions are listed in a drop-down menu. At first I coded it so the missions would appear as we posted them, which meant we had to manually change the "Mission" value ourselves, from "Mission": "1" to "Mission": "2", and so on. Then we set up a "fake" scenario: using a number randomizer, we generated values for rotation angle, centimeters moved, distance from the wall, and the like. This code looped 10 times, ending at random coordinates and values, and all ten missions were posted on the server and numbered one through ten. This was a way to simulate the mapper from a UI standpoint, and it gave us an easier way to test the UI we would eventually incorporate.
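Here is roughly what the fake-data loop looked like, assuming the Flask server accepts posts at a /missions route. The field names and number ranges below are placeholders, not the exact values we used.

```python
# A rough sketch of the "fake data" test posting ten random missions.
import random
import requests

SERVER = "http://localhost:5000/missions"  # hypothetical local address of the Flask server

for mission in range(1, 11):  # ten fake missions, numbered one through ten
    payload = {
        "Mission": str(mission),
        "rotation_angle": random.randint(0, 359),      # degrees
        "distance_moved": random.randint(10, 200),     # centimeters
        "distance_from_wall": random.randint(5, 100),  # centimeters
    }
    requests.post(SERVER, json=payload)
```

The nice part is that the mission number increments itself in the loop, so nobody has to edit "Mission": "1" by hand anymore.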
Speaking of which, the UI tool I decided to use is Bokeh. This is what we will use to draw the visual map on the screen as the bot goes. It doesn't work yet, but we got the server to the point where it recognizes that Bokeh is being called.
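For anyone curious, this is roughly the kind of plot Bokeh should eventually draw once the mapper feeds it real coordinates. The points below are made up just to show the idea; it is not the code running on our server.

```python
# A sketch of a Bokeh map of one mission path, using invented coordinates.
from bokeh.plotting import figure, show

# Example path: (x, y) positions the robot passed through, origin at the start.
xs = [0, 0, 30, 30, 60]
ys = [0, 40, 40, 80, 80]

plot = figure(title="Mission path", x_axis_label="x (cm)", y_axis_label="y (cm)")
plot.line(xs, ys, line_width=2)  # the route the robot drove
plot.scatter(xs, ys, size=8)     # waypoints where it turned
show(plot)  # opens the map in a browser
```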
I know that was a lot, and I am certain I used jargon where it didn't make sense and my explanations were occasionally unclear. This project is what my dad refers to as a "firehose" of information. I am learning on the job. My understanding of code prior to my decision to start the Robotics Venture was rudimentary. I had taken courses in JavaScript and knew the basic ideas. I am now slightly proficient in Python just through the stuff we have worked on so far. But overall, I am very new to most computer engineering concepts. This mapper project is hard for me to wrap my head around at times, but I am doing my best to get there. I hope whoever reads this is able to look past my technical shortcomings and see that.