2023 Q1
AI Alignment - Can Rager - Code - Prepared for and completed the coding test for the ARENA programme
AI Alignment - Can Rager - Other writing - Report - Developed a research proposal
AI Alignment - Guillaume Corlouer - Course - Got selected for the PIBBSS Fellowship
AI Alignment - Hamish Huggard - Project - Made a video on AI safety
AI Alignment - Hamish Huggard - Project - Animated the AXRP video "How do neural networks do modular addition?"
AI Alignment - Hamish Huggard - Project - Animated the AXRP video "Vanessa Kosoy on the Monotonicity Principle"
AI Alignment - Hamish Huggard - Project - Animated the AXRP video "What is mechanistic interpretability? Neel Nanda explains"
AI Alignment - Jaeson Booker - Other writing - Book Chapter - Completed authoring two-thirds of the AI Safety handbook
AI Alignment - Jaeson Booker - Project - Created the AI Safety Strategy Group's website
AI Alignment - Michele Campolo - Post (EA/LW/AF) - On value in humans, other animals, and AI
AI Alignment - Vinay Hiremath - Placement - Job - Selected as a Teaching Assistant for the ML for Alignment Bootcamp
Meta or Community Building - Can Rager - Event - AI Safety - Conducted online training in software engineering and mechanistic interpretability
Meta or Community Building - Chris Leong - Project - Collaborated with Abram Demski on an adversarial collaboration project
Meta or Community Building - Hamish Huggard - Project - Built and launched https://aisafety.world/
Meta or Community Building - Hamish Huggard - Project - Made a video to support an EANZ fundraiser for GiveDirectly
Meta or Community Building - Jaeson Booker - Project - Created the AI Safety Strategy Group and its GitHub project board
2023 Q2
AI Alignment - Can Rager - Post (EA/LW/AF) - Wrote a post on understanding mesa-optimisation using toy models (37)
AI Alignment - Chris Leong - Course - Got accepted to the online training + research sprint component of John Wentworth's SERI MATS stream
AI Alignment - Jaeson Booker - Course - Got accepted to the online training + research sprint component of John Wentworth's SERI MATS stream
AI Alignment - Jaeson Booker - Other writing - Review - Reviewed a policy report on AI safety
AI Alignment - Michele Campolo - Course - Got accepted to the online training + research sprint component of John Wentworth's SERI MATS stream
AI Alignment - Nia Gardner - Event - Organisation - Organised an ML Bootcamp in Germany
Animal Welfare - Ramika Prajapati - Project - Signed a contract to author a Scientific Policy Report for Rethink Priorities' Insect Welfare Institute, to be presented to the UK Government in 2023
Meta or Community Building - Daniela Tiznado - Project - 3 EA women collaborated at CEEALAR to co-work on building a global support network for EA Women [10%]
Meta or Community Building - [multiple people] - Event - Retreat - Hosted ALLFED team retreat
Meta or Community Building - Onicah Ntswejakgosi - Project - 3 EA women collaborated at CEEALAR to co-work on building a global support network for EA Women [10%]
Meta or Community Building - Ramika Prajapati - Project - 3 EA women collaborated at CEEALAR to co-work on building a global support network for EA Women [10%]
X-Risks - Jaeson Booker - Other writing - Blog - Wrote the blog post "Moloch, Disaster Monkeys, and Gremlins" on AI safety and x-risks
X-Risks - Jaeson Booker - Other writing - Blog - Wrote the blog post "The Amoral God-King" on AI safety and x-risks
X-Risks - Jaeson Booker - Other writing - Blog - Wrote the blog post "The Three Heads of the Dragon" on AI safety and x-risks