Compare commits
7 Commits
0a01a91443 ... alice-hust

| SHA1 |
|---|
| f02a1764e7 |
| 93ef40e0c0 |
| 0bed3433dd |
| 46094645b1 |
| a4ea2b94ee |
| 31d3eb966d |
| b8b39892d6 |
87 .github/prompts/resume-review-data-backend.prompt.md vendored Normal file
@@ -0,0 +1,87 @@
---
name: resume-review-data-backend
description: "Resume review optimized for data engineer and backend engineer roles. Evaluates against hiring priorities: orchestration (Airflow, Kubernetes), data lake design (Iceberg, Trino), FinOps/cost optimization, distributed systems maturity, and architecture contributions. Use when reviewing resumes for data eng or backend eng positions."
---

You are an experienced hiring manager for data engineering and backend engineering roles at top-tier startups and FAANG companies. You have hired engineers who specialize in orchestration, data platform design, and infrastructure optimization. Your standards are high. You are evaluating this resume as though it just landed in your inbox for a mid-to-senior data engineer or backend engineer role. Be direct, specific, and constructive — prioritize signal over style.

## Candidate Context

The candidate specializes in:
- **Orchestration**: Apache Airflow, Kubernetes, DAG design and optimization
- **Data Systems**: Lake architecture, Iceberg, Trino, data freshness SLOs, schema evolution
- **Linux & Networking**: Systems debugging, infrastructure troubleshooting
- **FinOps**: Cloud cost optimization, resource utilization, compute efficiency
- **Architecture**: Platform design contributions, collaborative work with architecture teams

This background should guide your evaluation. Prioritize hiring signals relevant to data platform scale, orchestration maturity, and cost-conscious infrastructure design.

## Your Task

Follow the steps in [resume-review.prompt.md](./resume-review.prompt.md) (Steps 1–4) but use the re-framed evaluation criteria below instead of generic SWE criteria.

---

## Data Engineer / Backend Engineer Resume Review

### Overall Impression

*Pass / Borderline / No — and why in 2–3 sentences.* Would this candidate clear screening for a mid-to-senior data engineer or backend engineer role? Does the resume signal depth in orchestration, data systems, distributed infrastructure, or cost optimization?

### Strengths

What genuinely stands out for data eng / backend eng hiring? Consider:
- Orchestration scale and maturity (Airflow DAGs, Kubernetes workloads, concurrent task management)
- Data systems design (lake architecture, schema design, data freshness SLOs, lineage tracking)
- Cloud/infra cost savings or optimization initiatives
- Distributed systems depth (multi-region, failover, consistency guarantees)
- Architecture contributions and design collaboration
- Recognizable employers or schools known for data/platform work

### ATS & Keywords (Data/Backend Focused)

Evaluate keyword coverage for a mid-to-senior data engineer or backend engineer at a startup or FAANG. Consider:
- **Orchestration & Compute**: Apache Airflow, Kubernetes, Spark, dbt, compute scaling, DAG design
- **Data Systems**: Iceberg, Trino, Athena, BigQuery, Redshift, Flink, data lake, lakehouse, schema evolution, partitioning, incremental loads
- **Cloud/Cost Optimization**: Resource optimization, spot instances, tiered storage, cost per query, workload isolation, compute efficiency
- **Distributed Systems**: Failure handling, SLOs/SLIs, replication, partitioning strategies, data freshness, eventual consistency
- **Infrastructure & Reliability**: Kubernetes, Terraform, disaster recovery, multi-region, automated failover, observability
- **Data Governance**: Data quality, schema registry, lineage, metadata management, data contracts
- **Certifications**: AWS SA, dbt, Databricks, Kubernetes CKA if held

List keywords that are **present and strong**, **present but weak**, and **missing or underrepresented** relative to typical data eng / backend eng JDs.

### Impact & Metrics (Data/Backend Oriented)

Review each bullet for data/platform maturity:
- Does it quantify *scale* (pipelines, throughput, DAGs, data volume)?
- Does it mention *cost impact* (savings, efficiency gains, spend reduction)?
- Does it cite *reliability* (SLO improvements, availability, freshness guarantees)?
- Does it show *orchestration depth* (DAG complexity, concurrency, scaling strategies)?
- Does it demonstrate *architecture design* contributions?

Call out specific bullets that are strong and specific bullets that need more data/platform signal.

### Clarity & Conciseness (Data/Backend Context)

Flag any content that is:
- SRE- or DevOps-focused when data eng / backend eng depth is more relevant
- Missing specificity on data systems (e.g., "optimized pipelines" vs. "reduced latency 40% via Iceberg partitioning")
- Over-weighted on operational toil vs. platform design or data scale
- Vague on cost or efficiency outcomes (e.g., "improved performance" without metrics)

### Formatting & Layout

Assess the rendered visual:
- Is the layout clean and easy to scan in 30 seconds?
- Does it fit on one page without overflow?
- Are section headers, dates, and company names visually distinct?
- Any alignment, spacing, or typography issues?

### Top 3–5 Actionable Improvements (Data/Backend Focused)

List the highest-priority changes for data eng / backend eng hiring, ranked by impact:
1. Add or strengthen data systems specificity (Iceberg, Trino, schema design, SLOs, lineage)
2. Quantify orchestration scale (number of DAGs, concurrency, scheduling frequency)
3. Highlight cost optimization or FinOps impact (% savings, compute efficiency, cost per workload)
4. Add cluster/infrastructure scale metrics (Kubernetes pod counts, ingestion throughput, query latency)
5. Frame architecture contributions explicitly (designed X, standardized Y, optimized Z)

Each item should name the bullet, specify the change, and justify why it matters for data/backend hiring.

---

**Use this prompt when you want to review resumes with the candidate's data engineering and backend engineering focus in mind.**
4 .gitignore vendored
@@ -16,4 +16,6 @@
*.pdf

result
.direnv
.direnv

*-review.md
94 README.md Normal file
@@ -0,0 +1,94 @@
# Resume Repository

Single-source LaTeX resume project for Alice Huston, with local build tooling and automated CI workflows.

## Repository Layout

- `resume.tex`: source of truth for resume content and formatting.
- `flake.nix`: reproducible Nix environment and PDF build derivation.
- `.github/prompts/resume-review.prompt.md`: structured prompt used for automated review.
- `.github/workflows/`: CI automation for build, checks, and review PRs.
- `archive/`: historical outputs (including automated review notes).

## Build Locally

### Option 1: Nix (recommended)

Build the canonical artifact:

```bash
nix build .#default
```

Output PDF:

- `result/Alice_Huston_Resume_Software_Engineer.pdf`

### Option 2: latexmk

```bash
latexmk -pdf resume.tex
```

Output PDF:

- `resume.pdf`

### Option 3: pdflatex

```bash
pdflatex resume.tex
```

Run `pdflatex` multiple times if references/layout need to settle.
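For example, two back-to-back passes are usually enough for this resume (a minimal sketch; rerun if the log still reports unresolved references):

```bash
# First pass writes the .aux data; second pass resolves references and final layout
pdflatex resume.tex
pdflatex resume.tex
```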
## Development Shell

Enter the Nix development shell (includes TeX tooling):

```bash
nix develop
```

## Clean Build Artifacts

```bash
latexmk -c
```

## CI / Automation

- `build-resume.yaml`
  - Builds resume PDF on push to `main` when `resume.tex` or workflow file changes.
  - Publishes the PDF as an artifact and creates a release.
- `flake-health-checks.yml`
  - Runs `nix flake check` on pushes and pull requests.
- `lock-health-checks.yml`
  - Validates health of `flake.lock`.
- `flake-update.yml`
  - Scheduled lockfile update workflow that opens a PR.
- `daily-resume-review.yml`
  - Runs daily and on manual dispatch.
  - Builds resume, sends prompt + source to an LLM review agent, writes review output to `archive/reviews/`.
  - Opens a PR only if the newly generated review differs from the most recent prior review (see the sketch after this list).
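A rough sketch of that comparison step, assuming reviews are archived as Markdown files under `archive/reviews/` as described above; the `NEW_REVIEW` name and file layout are illustrative, not the workflow's actual implementation:

```bash
# Keep today's review for the PR only when it differs from the most recent archived one.
set -euo pipefail

NEW_REVIEW="review-$(date +%F).md"   # illustrative name for the freshly generated output
LATEST=$(ls -1 archive/reviews/*.md 2>/dev/null | sort | tail -n 1 || true)

if [ -n "$LATEST" ] && diff -q "$LATEST" "$NEW_REVIEW" >/dev/null; then
  echo "Review unchanged since $LATEST; skipping PR."
else
  mkdir -p archive/reviews
  cp "$NEW_REVIEW" "archive/reviews/$(basename "$NEW_REVIEW")"
  echo "Review differs; staging it in archive/reviews/ for a PR."
fi
```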
## Secrets for Automation

Set these repository secrets for workflows that open PRs or call the model API:

- `GH_TOKEN_FOR_UPDATES`: token used by PR automation.
- `OPENAI_API_KEY`: API key for daily review workflow.

Optional environment variables:

- `OPENAI_BASE_URL` (default: `https://api.openai.com/v1`)
- `OPENAI_MODEL` (default: `gpt-5`)
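For running the review step by hand outside CI, the same names can be exported in a shell; only the variable names come from this README, and the values below are placeholders:

```bash
# Placeholders for a local run; CI reads these from repository secrets instead
export OPENAI_API_KEY="sk-..."                       # required
export OPENAI_BASE_URL="https://api.openai.com/v1"   # optional, default shown above
export OPENAI_MODEL="gpt-5"                          # optional, default shown above
```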
## Notes

- Generated files (`*.aux`, `*.log`, `*.fdb_latexmk`, `*.fls`, `*.pdf`, etc.) are ignored by Git.
- PDFs are configured for Git LFS via `.gitattributes`.
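As a hedged illustration (the repository's actual `.gitattributes` is not shown in this diff), LFS tracking for PDFs typically looks like:

```bash
# git-lfs writes the tracking rule into .gitattributes
git lfs track "*.pdf"
cat .gitattributes   # expect a line like: *.pdf filter=lfs diff=lfs merge=lfs -text
```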
BIN archive/Alex Heifler - rs.pdf LFS (binary file not shown)
BIN archive/Alice_Huston_Resume.pdf LFS (binary file not shown)
BIN archive/Resume.pdf LFS (binary file not shown)
BIN archive/alicehuston_resume.pdf LFS (binary file not shown)
BIN archive/msimpkins_Resume.pdf LFS (binary file not shown)
BIN archive/skyedoto_resume.pdf LFS (binary file not shown)
6 flake.lock generated
@@ -20,11 +20,11 @@
},
"nixpkgs": {
"locked": {
"lastModified": 1774386573,
"narHash": "sha256-4hAV26quOxdC6iyG7kYaZcM3VOskcPUrdCQd/nx8obc=",
"lastModified": 1777268161,
"narHash": "sha256-bxrdOn8SCOv8tN4JbTF/TXq7kjo9ag4M+C8yzzIRYbE=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "46db2e09e1d3f113a13c0d7b81e2f221c63b8ce9",
"rev": "1c3fe55ad329cbcb28471bb30f05c9827f724c76",
"type": "github"
},
"original": {
175 resume.tex
@@ -116,27 +116,19 @@
\end{multicols}
\end{center}

%-----------EDUCATION-----------
\vspace{-1.5em}
\section{Education}
\resumeSubHeadingListStart
\resumeEducation
{Stevens Institute of Technology}{Hoboken, NJ}{Aug. 2018 -- May 2022}
{B.S. of Computer Science, Minor in Literature}{GPA: 3.34/4.0}
\resumeSubHeadingListEnd

%-----------TECHNICAL SKILLS & CERTIFICATIONS-----------
\section{Technical Skills \& Certifications}
\vspace{-8pt}
\vspace{-12pt}

\begin{multicols}{2}
\small{
\textbf{Languages}{: C/C++, Python, Java, Bash, Typescript} \\
\textbf{Frameworks}{: Hadoop, Airflow, Kubernetes, Docker, OpenCV} \\
\textbf{Databases/Lakehouses}{: Starburst, Iceberg, Hive,
CockroachDB, OracleDB }\\
\textbf{Languages}{: Python, Java, Nix, Bash, SQL, C/C++, TypeScript} \\
\textbf{Frameworks}{: Hadoop, Apache Airflow, Kubernetes, Docker} \\
\textbf{Databases/Lakehouses}{: Starburst, Databricks, Iceberg, Hive,
CockroachDB, Oracle }\\
\textbf{OS}{: NixOS, RHEL 8, Debian, Ubuntu Server, Windows} \\
\textbf{Tools}{: LaTeX, Terraform, SQL, OpenGL}
\textbf{Tools}{: LaTeX, Terraform, SQL, PyArrow, OpenGL}
}

\columnbreak
@@ -148,22 +140,34 @@
\end{multicols}

%-----------EXPERIENCE-----------
\vspace{-12pt}
\section{Experience}
\resumeSubHeadingListStart
\resumeSubheading
{Software Engineer II}{Jan. 2025 -- Present}
{JPMorgan Chase}{Jersey City, NJ}
\resumeItemListStart
\resumeItem{Created configurable data ingestion framework for
automated data refinement and movement for hybrid data lake}
\resumeItem{Architected and implemented an orchestration system based
on Apache airflow for both event-based and SLA-based data ingestion}
\resumeItem{Led weekly office hour sessions to demonstrate and assist
with onboarding and configuring new datasets}
\resumeItem{Trained a team of developers to support the custom
frameworks for use across multiple applications}
\resumeItem{Created helm charts for making platform services
available and configurable to other teams }
\resumeItem{Designed and deployed configurable data ingestion framework
using Iceberg CTAS and time-travel for zero-outage updates,
orchestrating 200+ refinement pipelines with automated data
reconciliation across four zones (OLTP, raw, trusted, refined)}
\resumeItem{Implemented PyArrow-based validation and dual-engine
architecture supporting on-prem (Starburst) and off-prem (Databricks)
reporting for 50+ downstream teams}
\resumeItem{Architected and implemented Apache Airflow orchestration
supporting 1,000+ tasks per DAG with templated configuration-driven
design, tiered pooling to prevent resource exhaustion, and automated
partition registration in Trino for large Hive tables}
\resumeItem{Led weekly office hours to help onboard new datasets and
trained 10 developers to operate and extend the framework across
multiple applications, reducing MTTR for incidents}
\resumeItem{Led Kubernetes resource optimization across 30+ services in
three applications, implementing best-effort QoS in dev and test
environments while tuning production resources, achieving \$50k
annual cost savings in reservations and usage}
\resumeItem{Created reusable Helm charts and a shared service layer
that enabled 4 platform teams to deploy and configure UI services
more consistently}
\resumeItemListEnd

%% create a new resume item below for a software engineering job at
@@ -173,20 +177,24 @@ available and configurable to other teams }
{Site Reliability Engineer}{Jul. 2022 -- Jan. 2025}
{JPMorgan Chase}{Jersey City, NJ}
\resumeItemListStart
\resumeItem{Supported 30 applications}
\resumeItem{SME for Hadoop data lake, means additional support for
Tableau dashboards, Kubernetes applications, cloud foundry
applications, maintaining dremio instance, S3 compatible object store}
\resumeItem{SME for Linux systems, Networking (firewalls, load
balancers, etc.), Hadoop}
\resumeItem{Led toil reduction and noisy alert reduction across our
applications}
\resumeItem{Led onboarding and standardization efforts for
observability tooling}
\resumeItem{Fully automated disaster recovery procedures across a
subset of our applications}
\resumeItem{Automated copy, validation of data, and merging of across
Hadoop lakes (72 hour effort reduction)}
\resumeItem{Owned production support for 30 applications across
multiple teams, including deployment approvals, incident response,
root cause analysis, and post-mortems}
\resumeItem{Served as primary support engineer for a Hadoop-based data
lake platform spanning Tableau, Kubernetes, Cloud Foundry, Dremio,
and S3-compatible object storage}
\resumeItem{Served as the team expert on Linux, networking, and
Hadoop infrastructure supporting business-critical applications}
\resumeItem{Reduced toil and noisy alerts by 40\% through automated
recovery workflows and tighter monitoring and alerting controls}
\resumeItem{Standardized Dynatrace and Splunk onboarding across 30
applications, improving alert coverage and observability consistency}
\resumeItem{Automated disaster recovery procedures for a subset of
production applications, reducing manual failover steps}
\resumeItem{Automated historical data reload workflows using backup
cluster for reprocessing and merge back to primary Hive datasets,
reducing 72 hours of manual effort to zero and enabling on-demand
backfill capabilities}
\resumeItemListEnd

\resumeSubheading
@@ -194,11 +202,11 @@ Hadoop lakes (72 hour effort reduction)}
{Stevens Institute of Technology}{(Remote) Hoboken, NJ}
\resumeItemListStart
\resumeItem{Led a team of student interns to develop
\href{https://github.com/StevensDeptECE/GrailGUI}{\textbf{Grail}}, an
\textbf{OpenGL}-based graphics API and browser engine}
\resumeItem{Ported \textbf{C++} networking functionality on
\textbf{Linux} to \textbf{Windows} using \textbf{Winsock}}
\resumeItem{Added support for \textbf{ESRI Shapefiles} to draw and
\href{https://github.com/StevensDeptECE/GrailGUI}{Grail}, an
OpenGL-based graphics API and browser engine}
\resumeItem{Ported C++ networking functionality on
Linux to Windows using Winsock}
\resumeItem{Added support for ESRI Shapefiles to draw and
animate maps through rendering engine}
\resumeItem{Improved XDL Type system, a custom standard similar to
CORBA, to send and receive statically-typed data}
@@ -224,10 +232,10 @@ CORBA, to send and receive statically-typed data}
{Maritime Security Center}{Hoboken, NJ}
\resumeItemListStart
\resumeItem{Created an image classification system with
\textbf{OpenCV} to filter out noise and detect buoys in a
\textbf{ROS/Gazebo} simulation}
OpenCV to filter out noise and detect buoys in a
ROS/Gazebo simulation}
\resumeItem{Added mapping functionality to plot obstacles onto a 2D
map generated by \textbf{OctoMap}}
map generated by OctoMap}
\resumeItem{Optimized the image classification and mapping frameworks
to improve reliability in navigation}
\resumeItemListEnd
@@ -249,42 +257,51 @@ to improve reliability in navigation}
% Certifications moved to Technical Skills & Certifications section above

\section{Projects}
\resumeSubHeadingListStart
\resumeProjectHeading{SwitchForward}{Jun. 2020 -- Aug. 2020}
\resumeItemListStart
\resumeItem{A \textbf{Python}-based Telegram bot to send stock
updates for the Nintendo Switch during a supply shortage}
\resumeItem{Used the Gmail API to receive and parse emails from a
Google Group tracking Nintendo Switch stock}
\resumeItem{Sent updates to a Telegram announcements channel used by
\textbf{5-10} users}
\resumeItemListEnd

\resumeProjectHeading{Autonomous Robot}{Aug. 2018 -- Dec. 2018}
\resumeItemListStart
\resumeItem{An \textbf{Arduino}-based robot designed to navigate
through a maze}
\resumeItem{Primarily worked on pathplanning and control in a dynamic setting}
\resumeItem{Implemented basic error-correction to account for drift
during navigation}
\resumeItemListEnd

% Removing this project as it is not as relevant to the software
% engineering positions I am applying for
% and the work was not as technical as my other experiences
%\resumeProjectHeading{Cost-effective Road Anomaly Locator}{Sep. 2016
% -- May. 2018}
%\section{Projects}
%\resumeSubHeadingListStart
%\resumeProjectHeading{SwitchForward}{Jun. 2020 -- Aug. 2020}
%\resumeItemListStart
%\resumeItem{Designed an affordable methodology for implementing and
% monitoring a unit to detect potholes and other damaging road
% anomalies with \textbf{65\%} accuracy ($p<0.05$)}
%\resumeItem{Assembled and tested units for to collect data and
% demonstrate effectiveness of the unit}
%\resumeItem{Ran several tests and did statistical analysis on the
% resulting data}
%\resumeItem{A \textbf{Python}-based Telegram bot to send stock
% updates for the Nintendo Switch during a supply shortage}
%\resumeItem{Used the Gmail API to receive and parse emails from a
% Google Group tracking Nintendo Switch stock}
%\resumeItem{Sent updates to a Telegram announcements channel used by
% \textbf{5-10} users}
%\resumeItemListEnd
%
%\resumeProjectHeading{Autonomous Robot}{Aug. 2018 -- Dec. 2018}
%\resumeItemListStart
%\resumeItem{An \textbf{Arduino}-based robot designed to navigate
% through a maze}
%\resumeItem{Primarily worked on pathplanning and control in a dynamic setting}
%\resumeItem{Implemented basic error-correction to account for drift
% during navigation}
%\resumeItemListEnd
%
%% Removing this project as it is not as relevant to the software
%% engineering positions I am applying for
%% and the work was not as technical as my other experiences
%%\resumeProjectHeading{Cost-effective Road Anomaly Locator}{Sep. 2016
%% -- May. 2018}
%%\resumeItemListStart
%%\resumeItem{Designed an affordable methodology for implementing and
%% monitoring a unit to detect potholes and other damaging road
%% anomalies with \textbf{65\%} accuracy ($p<0.05$)}
%%\resumeItem{Assembled and tested units for to collect data and
%% demonstrate effectiveness of the unit}
%%\resumeItem{Ran several tests and did statistical analysis on the
%% resulting data}
%%\resumeItemListEnd

%\resumeSubHeadingListEnd

%-----------EDUCATION-----------
\vspace{-5pt}
\section{Education}
\resumeSubHeadingListStart
\resumeEducation
{Stevens Institute of Technology}{Hoboken, NJ}{Aug. 2018 -- May 2022}
{B.S. of Computer Science, Minor in Literature}{GPA: 3.34/4.0}
\resumeSubHeadingListEnd

\end{document}