Single Parent Family
The papers gathered under "single parent family" are not about that topic; they instead concern families of machine-learning models. Work in this area focuses on developing and improving large language models (LLMs) across modalities (text, image, audio, video) and specialized domains (finance, healthcare, code). Current efforts center on building more efficient and capable models through techniques such as progressive low-rank decomposition, instruction tuning, and refined training pipelines, often yielding model families that span several parameter scales. This work aims to broaden the capabilities and accessibility of LLMs, with applications ranging from financial technology and healthcare to software development and creative arts therapies.
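To make the low-rank decomposition technique mentioned above concrete, here is a minimal sketch of compressing a dense weight matrix into two smaller factors via truncated SVD. This is a generic illustration, not the progressive variant from any specific paper; the matrix sizes and rank are arbitrary assumptions.

```python
import numpy as np

# Illustrative sketch: approximate a dense weight matrix W (d_out x d_in)
# with two low-rank factors A (d_out x r) and B (r x d_in).
rng = np.random.default_rng(0)
d_out, d_in, rank = 256, 512, 16  # assumed sizes for illustration

W = rng.standard_normal((d_out, d_in))

# Truncated SVD: keep only the top `rank` singular components.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :rank] * s[:rank]   # fold singular values into the left factor
B = Vt[:rank, :]
W_approx = A @ B             # low-rank reconstruction of W

params_full = W.size               # 256 * 512 = 131072
params_lowrank = A.size + B.size   # 256*16 + 16*512 = 12288
print(params_full, params_lowrank)
```

Replacing `W` with the factor pair cuts the parameter count by roughly 10x here; the trade-off is approximation error, which is why practical methods choose the rank per layer or decompose progressively during training.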
Papers
Open Sentence Embeddings for Portuguese with the Serafim PT* encoders family
Luís Gomes, António Branco, João Silva, João Rodrigues, Rodrigo Santos
FINER++: Building a Family of Variable-periodic Functions for Activating Implicit Neural Representation
Hao Zhu, Zhen Liu, Qi Zhang, Jingde Fu, Weibing Deng, Zhan Ma, Yanwen Guo, Xun Cao