In collaboration with Tektronix in Fall 2023, I worked alongside a team of students to create an augmented reality (AR) tool for visualizing electromagnetic waveforms using Microsoft's HoloLens.
Using C# and Unity, we rendered live waveform data in AR so users could view electromagnetic signals in real time. Alongside the live trace, our tool also displayed predicted future waveforms.
My main role on this team was to build the machine learning model that generated those predictions. Using Python libraries including JAX, Keras, and TensorFlow, I developed a Recurrent Neural Network (RNN): given a window of recent datapoints from the live electromagnetic waveform, the model outputs the expected next values in the sequence.
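The next-step prediction idea can be sketched in Keras. The window length, layer sizes, and the sine-wave training data below are illustrative assumptions, not the project's actual configuration:

```python
# Minimal sketch: a Keras RNN that predicts the next sample of a waveform
# from a window of recent samples. All hyperparameters are illustrative.
import numpy as np
from tensorflow import keras

WINDOW = 64  # hypothetical: number of past samples fed to the model

model = keras.Sequential([
    keras.layers.Input(shape=(WINDOW, 1)),
    keras.layers.SimpleRNN(32),   # recurrent layer over the window
    keras.layers.Dense(1),        # predicted next sample
])
model.compile(optimizer="adam", loss="mse")

# Toy training data: sliding windows over a sine wave.
t = np.linspace(0, 8 * np.pi, 1000)
wave = np.sin(t)
X = np.stack([wave[i:i + WINDOW] for i in range(len(wave) - WINDOW)])[..., None]
y = wave[WINDOW:]
model.fit(X, y, epochs=1, verbose=0)

next_sample = model.predict(X[-1:], verbose=0)  # shape (1, 1)
```

In the real tool, the most recent window of live scope data would replace the sine wave, and the prediction would be fed back into the AR display.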
To support this work, I created a MySQL database that held over 2.5 million data points, logging our live data for later use in evaluating the ML model's accuracy.
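The logging pattern is simple to sketch. The project used MySQL, but to keep this example self-contained it uses Python's built-in sqlite3; the table and column names are illustrative:

```python
# Sketch of the waveform-logging pattern (the project used MySQL; sqlite3
# is substituted here so the example runs anywhere with no server).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE waveform_samples (
        id        INTEGER PRIMARY KEY,
        captured  TEXT    NOT NULL,   -- timestamp of the sample
        amplitude REAL    NOT NULL    -- measured waveform value
    )
""")

# Each live sample is appended as it arrives.
samples = [("2023-10-01T12:00:00.000", 0.12),
           ("2023-10-01T12:00:00.001", 0.37)]
conn.executemany(
    "INSERT INTO waveform_samples (captured, amplitude) VALUES (?, ?)",
    samples,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM waveform_samples").fetchone()[0]
```

Logged samples can later be replayed against the RNN's predictions to score accuracy offline.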
Throughout this project, we followed Agile methodologies, holding weekly scrums to coordinate progress and collaborate with our client.
During my final year of school, I saw an opportunity to sell digital goods for the video game Dark and Darker and founded darkanddarkergold LLC.
I built the business from the ground up, creating a Shopify website, deploying targeted ads, and optimizing the user experience, which lifted the conversion rate from 1% to 4%.
As the client base grew past 5,000 customers, I needed more help running the store's day-to-day operations, so I brought on and managed 8+ employees.
While the venture was profitable in the short term, the game's player base declined shortly after its initial launch, and I closed the site for good in late 2023.
This project applied least mean squared error (LMSE) linear regression with gradient descent to analyze the relationship between fishing effort (days fished) and salmon harvest (fish caught) for Alaskan fishermen. My model identified trends by minimizing error over multiple batch sizes for each style of fishing: setnet, driftnet, dipnet, and so on. The chart above shows the error loss across batch sizes for setnet fishing. The analysis provided insight into how the number of days fished and the fishing technique relate to total fish caught, allowing Alaskan fishermen to reach their harvest counts more efficiently.
I leveraged Python libraries like NumPy and pandas for efficient data processing and model evaluation, keeping the pipeline fast as the dataset grew.
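The core of the approach, mini-batch gradient descent on a linear fit, can be sketched in NumPy. The synthetic data, learning rate, and batch size below are stand-ins for the project's real dataset and tuning:

```python
# Sketch: mini-batch gradient descent fitting harvest = w * days + b.
# Data and hyperparameters are illustrative, not the project's real values.
import numpy as np

rng = np.random.default_rng(0)
days = rng.uniform(1, 30, size=200)                  # fishing effort (days)
harvest = 40 * days + 100 + rng.normal(0, 25, 200)   # synthetic "setnet" data

w, b = 0.0, 0.0
lr, batch_size = 1e-3, 32
for epoch in range(1000):
    idx = rng.permutation(len(days))
    for start in range(0, len(days), batch_size):
        batch = idx[start:start + batch_size]
        x, y = days[batch], harvest[batch]
        err = w * x + b - y
        # Gradients of the mean-squared-error loss for this batch.
        w -= lr * 2 * np.mean(err * x)
        b -= lr * 2 * np.mean(err)

mse = np.mean((w * days + b - harvest) ** 2)
```

Sweeping `batch_size` while tracking `mse` per epoch reproduces the kind of loss-vs-batch-size comparison described above.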
The project focused on predicting a single stock's price using an LSTM-based model, supporting both historical and live data. Users can either create a new model or load a pre-existing one; the model is trained, evaluated, and tested on separate datasets, and predicted future prices are visualized and saved for further analysis. The project offers flexibility in testing old models, working with live stock data, and forecasting prices over different time frames (7, 30, 90, or 365 days).
To optimize the LSTM model, I experimented with different architectures and parameters, such as the number of layers and the learning rate. I used TensorFlow and Keras for efficient training, evaluation, and deployment, ensuring that both historical and live stock data were handled seamlessly for accurate predictions across the supported time frames.
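A hedged sketch of this setup in Keras is below. The stacked-LSTM depth, unit counts, window size, and the synthetic price series are placeholders for the configurations I experimented with:

```python
# Sketch: a stacked LSTM trained on sliding windows of prices, then rolled
# forward to forecast a horizon. All hyperparameters are illustrative.
import numpy as np
from tensorflow import keras

WINDOW = 30  # days of history per training example (assumption)

model = keras.Sequential([
    keras.layers.Input(shape=(WINDOW, 1)),
    keras.layers.LSTM(50, return_sequences=True),
    keras.layers.LSTM(50),
    keras.layers.Dense(1),  # next-day price
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3), loss="mse")

# Synthetic random-walk prices in place of real market data.
prices = np.cumsum(np.random.default_rng(1).normal(0, 1, 400)) + 100
X = np.stack([prices[i:i + WINDOW] for i in range(len(prices) - WINDOW)])[..., None]
y = prices[WINDOW:]
model.fit(X, y, epochs=1, batch_size=32, verbose=0)

# Roll the model forward to forecast a horizon (e.g. 7 days): each
# prediction is appended to the window used for the next step.
window = X[-1]
forecast = []
for _ in range(7):
    nxt = model.predict(window[None], verbose=0)[0, 0]
    forecast.append(float(nxt))
    window = np.vstack([window[1:], [[nxt]]])
```

Extending the roll-forward loop to 30, 90, or 365 steps gives the longer forecast horizons.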
This project involved developing a perceptron model to classify night sky images based on the presence of the aurora. The dataset was manually labeled into three categories: aurora, no aurora, and uncertain. After extracting and normalizing color histograms (RGB), the perceptron was trained with various learning rates and batch sizes.
The best configuration I found (2,000 epochs, a 0.1 learning rate, and the maximum batch size) achieved up to 95% accuracy on the test set. While the model performed well on images clearly labeled as aurora or no aurora, accuracy varied on the ambiguous "uncertain" images.
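The pipeline, normalized RGB histograms fed to a perceptron, can be sketched as follows. The synthetic "images," the 8-bin histograms, and the training hyperparameters here are illustrative; the real dataset was hand-labeled night-sky photos:

```python
# Sketch: perceptron classification on normalized RGB color histograms.
# The toy images and hyperparameters are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(42)

def rgb_histogram(image, bins=8):
    """Concatenate per-channel histograms, normalized to sum to 1."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
    h = np.concatenate(hists).astype(float)
    return h / h.sum()

def make_image(aurora):
    """Toy stand-in: 'aurora' images skew green, others stay dark."""
    img = rng.integers(0, 120, size=(16, 16, 3))
    if aurora:
        img[..., 1] += 120  # boost the green channel
    return img

X = np.array([rgb_histogram(make_image(i % 2 == 0)) for i in range(200)])
y = np.array([1 if i % 2 == 0 else -1 for i in range(200)])

# Classic perceptron update: adjust weights only on misclassified points.
w = np.zeros(X.shape[1])
b = 0.0
lr = 0.1
for _ in range(50):
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:
            w += lr * yi * xi
            b += lr * yi

accuracy = np.mean(np.sign(X @ w + b) == y)
```

The "uncertain" category in the real project corresponds to histograms near the decision boundary, which is where a linear classifier like this is least reliable.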