Spacecraft Technology Center – HDMAX
Graduate Research Assistant, Software Engineer
Designed hardware and software for communicating with a payload onboard the International Space Station.
Fall 2003
References: Dr. James Ochoa, Dr. David Boyle
The Spacecraft Technology Center is a NASA Research Partnership Center. The mission of the STC is to provide the resources and skills necessary to give researchers and small businesses the opportunity not only to develop an idea, but to implement that idea in real hardware and software and test it in space. The STC is relatively new and still growing in capabilities and resources. I was brought on as a graduate research assistant to help the STC develop its custom avionics capabilities. Formerly, the STC relied heavily on outsourcing for its electrical and software needs and focused primarily on the mechanical and thermal issues associated with space hardware. My only project while working at the STC was HDMAX, a project done in conjunction with Florida Atlantic University. A laboratory at FAU has developed a small digital camcorder that provides extremely high resolution, approximately four times that of IMAX, in a much smaller form factor. The goal is to use this camera on board the International Space Station to film short commercial movie clips and experiments that require high-resolution video.
Our portion of the project was to create the base station for this high-resolution camera, including solid-state storage that operates at very high data rates, multiple user interfaces, Health and Safety monitoring, and Command and Data Handling interfaces to the International Space Station and the associated Ground Support Equipment. I was primarily responsible for architecting the interfaces between the camera’s control logic and each of the system peripherals and interfaces, and for rapidly prototyping these interfaces to get quick feedback on the human-machine interface, develop the software architecture, and validate the data communication protocols before committing the system to an embedded environment, namely an FPGA. This was accomplished by creating complex multithreaded applications in LabVIEW to simulate each of the major entities in the system block diagram: the digital data recorder, the user interface panel, the camera head, the camera base control logic, and the EXPRESS Rack Interface Controller.
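The LabVIEW prototypes themselves were graphical block diagrams, so they cannot be reproduced here, but the underlying pattern is simple: each major entity runs concurrently and exchanges command and status messages with its neighbors over queues. Below is a minimal textual analogue of that idea in VB.NET; the entity names and messages are purely illustrative and are not taken from the actual prototypes.

' Minimal sketch only: the actual prototypes were LabVIEW block diagrams.
' Each simulated entity runs on its own thread and passes messages through
' shared queues. All names and messages here are illustrative.
Imports System
Imports System.Threading
Imports System.Collections.Concurrent

Module EntitySimulationSketch

    ' Message queues between the simulated entities.
    Private ReadOnly ToBaseLogic As New BlockingCollection(Of String)()
    Private ReadOnly ToCameraHead As New BlockingCollection(Of String)()

    ' Simulated camera base control logic: forwards commands to the camera head.
    Private Sub BaseControlLogic()
        For Each cmd As String In ToBaseLogic.GetConsumingEnumerable()
            Console.WriteLine("Base logic received: " & cmd)
            ToCameraHead.Add(cmd)
        Next
    End Sub

    ' Simulated camera head: "executes" whatever command it receives.
    Private Sub CameraHead()
        For Each cmd As String In ToCameraHead.GetConsumingEnumerable()
            Console.WriteLine("Camera head executing: " & cmd)
        Next
    End Sub

    Sub Main()
        Dim logicThread As New Thread(AddressOf BaseControlLogic)
        Dim headThread As New Thread(AddressOf CameraHead)
        logicThread.Start()
        headThread.Start()

        ' A simulated user-interface panel issuing a command.
        ToBaseLogic.Add("RECORD 30")

        ToBaseLogic.CompleteAdding()
        logicThread.Join()
        ToCameraHead.CompleteAdding()
        headThread.Join()
    End Sub

End Module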
My other major area of responsibility was the Ground Support Equipment software and data communications with the payload. The number of hops that must be made to communicate between a payload and a remotely located operator is tremendous. Furthermore, the link has a severely limited packet size, very low bandwidth, huge latency and jitter, and is subject to long interruptions during Loss of Signal with the ISS. These restrictions complicate the project because HDMAX requires the ability to download high-resolution images remotely. To facilitate large, reliable file transfers and remote control and debugging capabilities, I defined a communication protocol for HDMAX. This protocol exists within the data payloads of all of the other transport packet structures: TCP, Ethernet, EHS, CCSDS, and EXPRESS. The hops and associated packet structures are illustrated in the graphic below.
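To make the layering concrete, the sketch below shows the general idea of one packet structure riding inside the data field of the next. The field layouts, tag values, and even the exact layer ordering here are placeholders for illustration only; the real EXPRESS, CCSDS, and EHS structures are far more involved.

' Illustrative only: a toy HDMAX packet wrapped inside stand-in "layers",
' each of which simply prepends a tag byte and a length byte.
Imports System

Module PacketNestingSketch

    ' Hypothetical HDMAX application packet: sequence number, opcode, data.
    Private Function BuildHdmaxPacket(seq As UShort, opcode As Byte, data As Byte()) As Byte()
        Dim pkt(3 + data.Length - 1) As Byte
        pkt(0) = CByte(seq >> 8)            ' sequence number, high byte
        pkt(1) = CByte(seq And &HFF)        ' sequence number, low byte
        pkt(2) = opcode                     ' command/response opcode
        Array.Copy(data, 0, pkt, 3, data.Length)
        Return pkt
    End Function

    ' Stand-in for one transport layer carrying the previous layer as payload.
    Private Function Encapsulate(layerTag As Byte, payload As Byte()) As Byte()
        Dim framed(2 + payload.Length - 1) As Byte
        framed(0) = layerTag
        framed(1) = CByte(payload.Length And &HFF)
        Array.Copy(payload, 0, framed, 2, payload.Length)
        Return framed
    End Function

    Sub Main()
        Dim hdmax As Byte() = BuildHdmaxPacket(1, &H10, New Byte() {30})
        Dim express As Byte() = Encapsulate(&HE1, hdmax)      ' EXPRESS carries HDMAX
        Dim ccsds As Byte() = Encapsulate(&HC2, express)      ' CCSDS carries EXPRESS
        Dim ehs As Byte() = Encapsulate(&HE5, ccsds)          ' EHS carries CCSDS
        Console.WriteLine("Bytes handed to TCP: " & BitConverter.ToString(ehs))
    End Sub

End Module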
To simplify communication with the payload from the GSE software, I created an Application Programming Interface (API), represented by the yellow box at the top of the graphic labeled “HDMAX Software”. In actuality, this box represents a server that can support multiple clients, thereby piping all of the input and output data through a single station. Although this architecture creates a single point of failure, it greatly reduces the interface requirements with the Payload Operations Integration Center (POIC) and provides for the future capability to act as a broker of the shared resource (HDMAX) in the event of conflicting user commands. This API, written in VB.NET, hides all of the communication and protocol details between the GUI and the payload from the GUI developer. For example, the API exposes a method such as RecordImmediate(Duration). A GUI calls this method, and the API uses the Telescience Resource Kit (TReK) to send the appropriate packet via TCP to EHS, which will eventually uplink the command. Upon receiving the command, the HDMAX payload executes the instruction and responds with status information. If this status information is not received within a predefined period of time, the API automatically resends the instruction. However, the API is not limited to stop-and-wait functionality; instead, it uses a sliding window with selective retransmission. This allows a file transfer to keep the uplink pipe full while retransmitting only those packets that do not receive an acknowledgment within the timeout period. The API fires events to notify the clients of newly arrived unexpected data, as well as to indicate the status of unacknowledged packets. I also did some prototyping of a GSE GUI, but this was primarily for my own testing purposes during the development of the API. The API was not completed, but a functional skeleton was demonstrated.
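The fragment below sketches the shape of that design: a command method such as RecordImmediate, a table of unacknowledged packets, a timer that selectively retransmits whatever has timed out, and events raised back to the client GUIs. It is a simplified stand-in, not the actual API; the real version sat on top of TReK, and every name, field size, and timeout value here is assumed for illustration.

' Simplified, hypothetical sketch of the API design described above.
' The real API was built on TReK; the names, packet layout, and timeout
' values here are illustrative only.
Imports System
Imports System.Collections.Generic
Imports System.Threading

Public Class HdmaxApiSketch

    ' Raised when data arrives that no outstanding command was expecting.
    Public Event UnexpectedDataReceived(data As Byte())
    ' Raised when an outstanding command has gone unacknowledged past the timeout.
    Public Event PacketTimedOut(sequence As UShort)

    Private ReadOnly _unacked As New Dictionary(Of UShort, KeyValuePair(Of Byte(), DateTime))()
    Private ReadOnly _lock As New Object()
    Private ReadOnly _timeout As TimeSpan = TimeSpan.FromSeconds(10)
    Private ReadOnly _retryTimer As Timer
    Private _nextSeq As UShort = 0

    Public Sub New()
        ' Periodically rescan the window and resend anything past its timeout.
        _retryTimer = New Timer(AddressOf RetransmitExpired, Nothing, 1000, 1000)
    End Sub

    ' Example command: tell the payload to start recording for a given duration.
    Public Sub RecordImmediate(durationSeconds As Byte)
        SendCommand(&H10, New Byte() {durationSeconds})
    End Sub

    ' Called by the receive path for every downlinked HDMAX packet.
    Public Sub HandleDownlink(seq As UShort, isAck As Boolean, data As Byte())
        If isAck Then
            SyncLock _lock
                _unacked.Remove(seq)        ' acknowledged: drop it from the window
            End SyncLock
        Else
            RaiseEvent UnexpectedDataReceived(data)
        End If
    End Sub

    Private Sub SendCommand(opcode As Byte, data As Byte())
        SyncLock _lock
            Dim seq As UShort = _nextSeq
            _nextSeq = CUShort((_nextSeq + 1) And &HFFFF)
            Dim packet As Byte() = BuildPacket(seq, opcode, data)
            _unacked(seq) = New KeyValuePair(Of Byte(), DateTime)(packet, DateTime.UtcNow)
            Transmit(packet)                ' real API: hand off to TReK for uplink
        End SyncLock
    End Sub

    ' Selective retransmit: only packets whose timeout has expired are resent,
    ' so a file transfer can keep the uplink full while waiting on the rest.
    Private Sub RetransmitExpired(state As Object)
        SyncLock _lock
            For Each seq As UShort In New List(Of UShort)(_unacked.Keys)
                Dim entry = _unacked(seq)
                If DateTime.UtcNow - entry.Value > _timeout Then
                    Transmit(entry.Key)
                    _unacked(seq) = New KeyValuePair(Of Byte(), DateTime)(entry.Key, DateTime.UtcNow)
                    RaiseEvent PacketTimedOut(seq)
                End If
            Next
        End SyncLock
    End Sub

    Private Function BuildPacket(seq As UShort, opcode As Byte, data As Byte()) As Byte()
        Dim pkt(3 + data.Length - 1) As Byte
        pkt(0) = CByte(seq >> 8)
        pkt(1) = CByte(seq And &HFF)
        pkt(2) = opcode
        Array.Copy(data, 0, pkt, 3, data.Length)
        Return pkt
    End Function

    Private Sub Transmit(packet As Byte())
        Console.WriteLine("Uplink " & packet.Length & " bytes")   ' placeholder for TReK send
    End Sub

End Class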
This project was my first experience with the Aerospace and Defense industry. It required that I become familiar with volumes of NASA, Boeing, and product-specific documentation very quickly. It was like drinking from a fire hydrant. I learned many new things from working on this project, and I am very sorry I could not see it through to completion. After I graduated, the STC was not able to bring me on full time, so I was not able to continue my work on HDMAX. To my knowledge, my work is still being utilized in some capacity. However, with the exception of the GSE software, my software was never intended to be used in the final product on board the ISS. I hope I will someday get to work on a project that will eventually go into space so I can point up to the stars and say, “I made something that’s up there right now.” That would be cool! 🙂