Well, part of this week was spent offsite at the IEEE TechIgnite 2017 conference in San Francisco, so not much nighttime sky observing got done, and besides, the weather has not been the best. So, I want to provide a few comments about the conference, then recommend a couple of books to keep you busy while staying dry inside, and then end with some of the first light analysis for the indoor solar spectrum acquired last week with the LISA spectrograph.
The TechIgnite conference brought together a couple of hundred engineers to discuss a diverse set of topics, including cyber-security, disruptive technologies like artificial intelligence, virtual and augmented reality, quantum computing, and improved machine learning, and their impacts on society and individuals. There were about 50 presenters and vendors present, and I will not be able to summarize the many topics very well, but will try to capture a few key points that struck me. Also, since I have not been involved with programming for many years now, being at the conference gave me a little more excitement and enthusiasm to complete my recently enrolled Python programming class. Other topics discussed included questions like: if self-driving cars evolve to self-driving trucks, could 3.5 million truck drivers find themselves out of work? Other jobs could be impacted by the ever increasing ability of artificial intelligence to perform context analysis and decision making that surpasses currently used systems, which just follow a linear, pre-programmed script outlining the possible paths imagined by the engineers who wrote the initial program. Newer systems are starting to include non-verbal cues and more learning and reasoning, and will be much more effective and productive.
The first keynote speaker was Grady Booch, IBM, who was interviewed on a wide range of topics regarding the impact of increasing artificial intelligence on society. Of course, Booch is well recognized for his huge impact on object-oriented programming language development. I can remember learning about "Booch Diagrams", etc., which eventually became incorporated into the Unified Modeling Language. He said he was not worried about the "singularity" taking over, but rather about what the effect would be when the amount of human labor required to make all of the goods in the market is much less than the number of people looking for work. He also mentioned the Uber software fraud case, asked whether software engineers would object to being asked to do work that was unethical, and urged the incorporation of more ethics classes into engineering curricula.
Another speaker was Steven Bay, who is now widely known as the boss who hired Edward Snowden. He told the story of how Snowden was hired, what the work environment was like, and how no red flags surfaced until the fateful day they all learned what Snowden, who had taken "sick" leave just before the event, had done. He went on to talk about various cyber threats, including the trusted insider threat.
Tony Jebara, Netflix, described how learning algorithms have evolved along many layers and how machine learning explores and exploits users' experiences and expectations. It is like continuous observation while trying out different options and testing to see which approach yields the highest viewer satisfaction and, of course, the highest video delivery success. He talked about the ethics of experimenting on large audiences and the tradeoffs that consumers face in granting access and losing some privacy in order to get the offered benefits. It's almost as if the machine algorithms "watching" our every mouse click and selection can put together a description of each of us, often resulting in the algorithm knowing us better than some of our friends do.
Danny Lange, Unity, presented on machine learning and how the general reinforcement learning algorithm is proving very effective at teaching machines new behavior that was not pre-programmed. In fact, even when the programmers themselves had no idea how to solve some problem, the self-learning software was able to figure out what to do by a combination of exploration and then exploitation of emerging learned opportunities. One of the more interesting demonstrations of self-learning computer systems was a short video he showed of the robots being developed by Boston Dynamics. I couldn't find the exact video that he used in his presentation, but there are many online and I have pasted one of the shorter videos here. It is simply amazing to see what this robot can do!
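For readers curious about that explore-then-exploit idea, it can be illustrated with a tiny "multi-armed bandit" sketch in Python. This is just my own toy example (the reward probabilities and the epsilon-greedy strategy are my illustrative choices, not anything from the actual talk):

```python
import random

def epsilon_greedy_bandit(reward_probs, steps=10000, epsilon=0.1, seed=42):
    """Toy multi-armed bandit: with probability epsilon pick a random arm
    (explore); otherwise pick the arm with the best estimated reward so
    far (exploit). Returns the index of the arm the learner ends up
    believing is best."""
    rng = random.Random(seed)
    counts = [0] * len(reward_probs)    # pulls per arm
    values = [0.0] * len(reward_probs)  # running mean reward per arm
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(len(reward_probs))  # explore
        else:
            arm = values.index(max(values))         # exploit
        # Bernoulli reward drawn from the arm's (hidden) success probability
        reward = 1.0 if rng.random() < reward_probs[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
    return values.index(max(values))

# With enough exploration, the learner discovers that arm 2 (80% payoff)
# is the best choice, without ever being told the probabilities.
best = epsilon_greedy_bandit([0.2, 0.5, 0.8])
```

The point is the same one Lange made: nothing in the code says which arm is best; the answer emerges from trying options and keeping score.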
Professor John Martinis (now that is a name I can get behind), UCSB and Google, described the latest results of Google's quantum computing project. He said that he is not that worried about quantum computing being able to break the RSA encryption algorithms currently used widely for internet business transactions, because that is still not possible, and even if and when the capability is developed, there are already encryption algorithms available that cannot be broken by quantum computing. He described the quantum supremacy test just completed, in which nine qubits were used to verify the predictions of quantum chemistry performed on everyday classical computers. Even this small number of qubits required many years of development, and the quantum computer must operate very close to absolute zero to prevent interference from the environment, which would result in decoherence of the quantum state and huge errors. Google and many other companies are busy trying to develop quantum computers, but it will be several more years before the number of qubits gets up into the hundreds, at which time some really useful computations are thought to become possible. We will see!
Satyam Priyadarshy, Halliburton, described some of the productivity improvements occurring in the oil and gas business as a result of incorporating more digital integration of sensors and operations in far-flung oil fields. The incorporation of data and big data analytics is creating increased productivity and safety in other large manufacturing industries as well, as described by William Ruh, GE. He described how integration of manufacturing processes and automation of windfarms, for example, have resulted in significant increases in efficiency and power generation. He also described how "digital twins", that is, computer models of aircraft jet engines that process live data from the engines while in flight, have made better predictions of when maintenance is needed and are better able to prevent in-flight failures. In both of these industries, and supposedly in others as well, the term "computation at the edge" is being used more and more; it just means that computation is increasingly done at the source where the data is generated, rather than having the data sent back to a central site for processing and analysis.
Finally, one of the keynote speakers at the conference was "The Woz", Steve Wozniak. The session chair doing the interview drew Steve into a discussion mostly of how he grew up and became interested in computers and how he met Steve Jobs and how they did what they eventually did. Woz was very animated and told many stories of his youth and how he enjoyed doing pranks, listening to music, learning about logic and computers and his early efforts at working with HP and Atari and early personal computer enthusiasts and developers.
|Steve Wozniak "fireside chat" captivates the crowd at IEEE TechIgnite2017 in San Francisco|
As Woz was regaling us with his stories of growing up and becoming interested in computers and computing, I reminisced about my own journey, which was essentially contemporaneous with his. The photo below, grainy and stained as it is, shows a young me next to my science fair project, which was supposed to demonstrate a mechanical mouse moving through a maze, learning the way, and being able to run it correctly the second time through. Well, the project never worked, and in hindsight probably couldn't work. I remember the little motor-driven mouse would go haywire and try to climb over the walls, and even if it made its way through a couple of the maze's intersections, it would eventually lose its stored memory and forget where it was. I tracked this down to the many dozens of electromechanical relays which, while functioning as memory cells, pulled more and more power as the number of learned steps increased; my little power supply just couldn't keep them all energized, and the weaker ones just gave up, dropped out, and forgot what state they should have been in. You can just barely make out the mouse in the photo, right where the umbilical power cable drops down to the mast mounted on the mouse. Anyway, Woz went on to become a millionaire and Silicon Valley icon and I just ended up where I'm at, but at least the journey has been fun.
|Resident Astronomer as young high school "engineer wannabe" with his electromechanical maze learning mouse|
(Source: Palmia Observatory)
|This field guide identifies March/April as ideal time to attempt the marathon|
Another recommendation is this book by Langmuir and Broecker, "How to Build a Habitable Planet." Wow, if you like history, this book is about "big history" and begins with the big bang and covers everything else up till now. It looks pretty neat, so check it out!
|If you like big history stories, this is perhaps the biggest story|
Now, finally, let's discuss some of the preliminary analysis of the first light through the LISA spectrograph and the first attempt at calibrating the spectrum that was described in last week's post. Remember that the spectrum was taken by just opening the light entry path to indoor room light, illumination provided by sunlight alone with no artificial lighting. So, it was not necessary to point the spectrograph at the sun or anything like that, just rely on the sunlight coming in through the office window. Then, using the spectrograph's built-in neon lamp, one is able to calibrate the spectrograph to show how the measured light intensity varies with real wavelength. The analysis tool used for this calibration and analysis is ISIS, which here stands for Integrated Software for Imaging and Spectroscopy; the acronym is scary, but it has nothing to do with the terror-oriented group.
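The neon lamp calibration, by the way, boils down to fitting a relation between pixel position on the sensor and the known wavelengths of identified neon emission lines. Here is a minimal Python sketch of the idea with a simple linear (least-squares) dispersion fit; the pixel positions below are invented for illustration, while the neon wavelengths are real lines from standard line lists:

```python
# Fit wavelength = a * pixel + b from identified neon calibration lines.
def fit_dispersion(pixels, wavelengths):
    """Ordinary least-squares fit of a straight-line dispersion relation."""
    n = len(pixels)
    mx = sum(pixels) / n
    my = sum(wavelengths) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(pixels, wavelengths))
         / sum((x - mx) ** 2 for x in pixels))
    b = my - a * mx
    return a, b

# Hypothetical pixel positions of four well-known neon emission lines
pixels      = [212.0, 339.0, 452.0, 727.0]
wavelengths = [585.25, 614.31, 640.23, 703.24]  # nm, from standard line lists

a, b = fit_dispersion(pixels, wavelengths)
# Any pixel column can now be converted to wavelength: lam = a * pixel + b
```

Real spectrograph software typically fits a higher-order polynomial for better accuracy, but the principle is the same: known lamp lines anchor the pixel-to-wavelength mapping.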
Well, check out the final calibrated spectrum below. Something is not right! The calibration method consists of taking, in addition to the image of the source, several calibration images: a dark frame (no light at all, just the dark current), a plain flat field image, a very short duration image to identify the bias, and a calibration image of the neon lamp spectrum. Then, by assigning each of these images to the right location on the software calibration page, the final calibrated spectrum, like that below, was calculated. Well, something did not work out. The red line in the screenshot, if we believe the calibration, is at 652 nm, and part of the spectrum covering the entire visible range of wavelengths seems to be missing.
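For anyone wondering what the software actually does with those dark, bias, and flat frames, the basic frame arithmetic of standard CCD reduction is straightforward. Here is a minimal NumPy sketch of my own (an illustration of the general technique, not the actual ISIS code):

```python
import numpy as np

def calibrate(raw, dark, bias, flat):
    """Standard CCD frame reduction.

    raw  - science frame (here, the solar spectrum image)
    dark - dark frame, same exposure time as raw (includes bias signal)
    bias - very short exposure frame (readout offset only)
    flat - evenly illuminated frame recording pixel-to-pixel sensitivity
    """
    # The dark frame, taken at the same exposure, already contains the
    # bias signal, so raw - dark removes both thermal and bias terms.
    science = raw.astype(float) - dark
    # Remove bias from the flat, then normalize it to unit mean so that
    # dividing corrects sensitivity variations without rescaling flux.
    flat_corr = flat.astype(float) - bias
    flat_norm = flat_corr / flat_corr.mean()
    return science / flat_norm
```

If one of these frames is assigned to the wrong slot on the calibration page, the arithmetic above silently produces a distorted result, which is one plausible culprit for a spectrum that comes out looking wrong.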
|Oops, something not right with First solar spectra attempt using LISA and ISIS software (Source: Palmia Observatory)|
Now, previous work here at the observatory with an inexpensive, lab bench top spectrometer produced the spectral measurement below. This image was collected and described in posts from the December 2015 timeframe. It turned out that the spectrometer was sensitive enough for laboratory analysis of chemical compounds and such, but was not even close to sensitive enough for astronomical observations. But notice how well the spectrum matches what we have come to expect for solar radiation.
|Indoor solar spectrum evaluation performed with commercial bench top fiber optic spectrometer|
Source: Palmia Observatory
So, until that analysis gets done, that is about it for this week. If the weather forecast changes to more clear and less cloudy we might see you at Black Star Canyon.
Until next time,