Generative AI now serves a large number of users every day, but its substantial energy costs may contribute to unequal access. Training and inference at scale consume significant electricity, and research suggests that more efficient model design can cut energy use considerably. This event highlights resilient, resource-efficient AI that leverages lightweight models, compression, and optimization to reduce energy demand, broaden access, address emerging energy bottlenecks, and support sustainable AI deployment worldwide.