5 Commits

SHA1        Message  Date

dcfc1ac313  education updates  2026-04-30 14:51:53 -04:00
  All checks were successful:
    Build and Release Resume PDF / date-fetch (push): successful in 2s
    Check flake.lock / Check health of `flake.lock` (push): successful in 7s
    Check Nix flake / Perform Nix flake checks (push): successful in 57s
    Build and Release Resume PDF / build (push): successful in 4m3s

f02a1764e7  fix spacing  2026-04-30 14:48:13 -04:00
  All checks were successful: date-fetch 2s, flake.lock check 15s, flake checks 1m4s, build 2m58s

93ef40e0c0  move education to the bottom  2026-04-30 14:35:02 -04:00
  All checks were successful: date-fetch 4s, flake.lock check 7s, flake checks 2m0s, build 4m29s

0bed3433dd  lang fix  2026-04-30 14:21:39 -04:00
  All checks were successful: date-fetch 2s, flake.lock check 7s, flake checks 41s, build 1m24s

46094645b1  format, flake update  2026-04-30 14:09:34 -04:00
  All checks were successful: date-fetch 3s, flake.lock check 15s, flake checks 44s, build 1m27s
2 changed files with 29 additions and 27 deletions

flake.lock (generated)

@@ -20,11 +20,11 @@
     },
     "nixpkgs": {
       "locked": {
-        "lastModified": 1774386573,
-        "narHash": "sha256-4hAV26quOxdC6iyG7kYaZcM3VOskcPUrdCQd/nx8obc=",
+        "lastModified": 1777268161,
+        "narHash": "sha256-bxrdOn8SCOv8tN4JbTF/TXq7kjo9ag4M+C8yzzIRYbE=",
         "owner": "NixOS",
         "repo": "nixpkgs",
-        "rev": "46db2e09e1d3f113a13c0d7b81e2f221c63b8ce9",
+        "rev": "1c3fe55ad329cbcb28471bb30f05c9827f724c76",
         "type": "github"
       },
       "original": {
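
The `lastModified`/`narHash`/`rev` bump above is the kind of change that `nix flake update` writes into flake.lock (consistent with the "format, flake update" commit message). A minimal sketch, assuming the flake lives in the current directory; the per-input form works on recent Nix, while older releases spell it `nix flake lock --update-input nixpkgs`:

```shell
# Refresh every pinned input recorded in flake.lock
# (rewrites lastModified, narHash, and rev, as in the hunk above).
nix flake update

# Or bump only the nixpkgs input, leaving other pins untouched.
nix flake update nixpkgs
```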


@@ -116,25 +116,17 @@
 \end{multicols}
 \end{center}
-%-----------EDUCATION-----------
-\vspace{-1.5em}
-\section{Education}
-  \resumeSubHeadingListStart
-    \resumeEducation
-      {Stevens Institute of Technology}{Hoboken, NJ}{Aug. 2018 -- May 2022}
-      {B.S. of Computer Science, Minor in Literature}{GPA: 3.34/4.0}
-  \resumeSubHeadingListEnd
 %-----------TECHNICAL SKILLS & CERTIFICATIONS-----------
 \section{Technical Skills \& Certifications}
-\vspace{-8pt}
+\vspace{-12pt}
 \begin{multicols}{2}
   \small{
-    \textbf{Languages}{: C/C++, Python, Java, Bash, Typescript} \\
+    \textbf{Languages}{: Python, Java, Nix, Bash, SQL, C/C++, TypeScript} \\
     \textbf{Frameworks}{: Hadoop, Apache Airflow, Kubernetes, Docker} \\
     \textbf{Databases/Lakehouses}{: Starburst, Databricks, Iceberg, Hive,
-      CockroachDB, OracleDB }\\
+      CockroachDB, Oracle }\\
     \textbf{OS}{: NixOS, RHEL 8, Debian, Ubuntu Server, Windows} \\
     \textbf{Tools}{: LaTeX, Terraform, SQL, PyArrow, OpenGL}
   }
@@ -148,6 +140,7 @@
 \end{multicols}
 %-----------EXPERIENCE-----------
+\vspace{-12pt}
 \section{Experience}
   \resumeSubHeadingListStart
     \resumeSubheading
@@ -155,25 +148,25 @@
       {JPMorgan Chase}{Jersey City, NJ}
       \resumeItemListStart
         \resumeItem{Designed and deployed configurable data ingestion framework
           using Iceberg CTAS and time-travel for zero-outage updates,
           orchestrating 200+ refinement pipelines with automated data
           reconciliation across four zones (OLTP, raw, trusted, refined)}
         \resumeItem{Implemented PyArrow-based validation and dual-engine
           architecture supporting on-prem (Starburst) and off-prem (Databricks)
           reporting for 50+ downstream teams}
         \resumeItem{Architected and implemented Apache Airflow orchestration
           supporting 1,000+ tasks per DAG with templated configuration-driven
           design, tiered pooling to prevent resource exhaustion, and automated
           partition registration in Trino for large Hive tables}
         \resumeItem{Led weekly office hours to help onboard new datasets and
           trained 10 developers to operate and extend the framework across
           multiple applications, reducing MTTR for incidents}
         \resumeItem{Led Kubernetes resource optimization across 30+ services in
           three applications, implementing best-effort QoS in dev and test
           environments while tuning production resources, achieving \$50k
           annual cost savings in reservations and usage}
         \resumeItem{Created reusable Helm charts and a shared service layer
           that enabled 4 platform teams to deploy and configure UI services
           more consistently}
       \resumeItemListEnd
@@ -185,10 +178,10 @@ more consistently}
       {JPMorgan Chase}{Jersey City, NJ}
       \resumeItemListStart
         \resumeItem{Owned production support for 30 applications across
           multiple teams, including deployment approvals, incident response,
           root cause analysis, and post-mortems}
         \resumeItem{Served as primary support engineer for a Hadoop-based data
           lake platform spanning Tableau, Kubernetes, Cloud Foundry, Dremio,
           and S3-compatible object storage}
         \resumeItem{Served as the team expert on Linux, networking, and
           Hadoop infrastructure supporting business-critical applications}
@@ -199,8 +192,8 @@ applications, improving alert coverage and observability consistency}
         \resumeItem{Automated disaster recovery procedures for a subset of
           production applications, reducing manual failover steps}
         \resumeItem{Automated historical data reload workflows using backup
           cluster for reprocessing and merge back to primary Hive datasets,
           reducing 72 hours of manual effort to zero and enabling on-demand
           backfill capabilities}
       \resumeItemListEnd
@@ -302,4 +295,13 @@ to improve reliability in navigation}
 %\resumeSubHeadingListEnd
+%-----------EDUCATION-----------
+\vspace{-5pt}
+\section{Education}
+  \resumeSubHeadingListStart
+    \resumeEducation
+      {Stevens Institute of Technology}{Hoboken, NJ}{Aug. 2018 -- May 2022}
+      {B.S. of Computer Science, Minor in Literature}{}%GPA: 3.34/4.0}
+  \resumeSubHeadingListEnd
 \end{document}