June 18, 2025
Industry-backed group calls for support of AI ‘distillation’ practice in federal R&D plan
The federal government should fund research on “distillation,” a controversial form of artificial intelligence training, given the industry’s growing demand for computing resources, according to the Chamber of Progress.
OpenAI, along with Trump administration officials such as AI czar David Sacks, has accused DeepSeek of using distillation -- the process of training an AI model by having it ask questions of an existing, stronger AI model -- to “copy” the U.S. company’s work, in violation of its terms of service....
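In its simplest form, the loop described above amounts to eliciting answers from a stronger “teacher” model and using them as training data for a smaller “student” model. The toy sketch below is illustrative only; `teacher_answer` and the canned responses are hypothetical placeholders, not any vendor’s actual API or method.

```python
# Toy sketch of the distillation loop described in the article: a "student"
# model is trained on answers elicited from a stronger "teacher" model.
# All names here are illustrative placeholders, not a real vendor API.

def teacher_answer(prompt: str) -> str:
    """Stand-in for querying an existing, stronger AI model."""
    # A real pipeline would call the teacher model's API here.
    canned = {
        "What is 2 + 2?": "4",
        "Capital of France?": "Paris",
    }
    return canned.get(prompt, "unknown")


def distill(prompts: list[str]) -> list[tuple[str, str]]:
    """Collect (prompt, teacher answer) pairs to train a student model on."""
    return [(p, teacher_answer(p)) for p in prompts]


if __name__ == "__main__":
    training_data = distill(["What is 2 + 2?", "Capital of France?"])
    # A real pipeline would now run supervised fine-tuning of the smaller
    # student model on training_data; here we just show the collected pairs.
    print(training_data)
```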