English-Chinese Dictionary (51ZiDian.com)



forgo    pronunciation: [fɔrg'o]
vt. to renounce; to give up

forgo
    v 1: do without or cease to hold or adhere to; "We are
         dispensing with formalities"; "relinquish the old ideas"
         [synonym: {waive}, {relinquish}, {forgo}, {forego},
         {foreswear}, {dispense with}]
    2: be earlier in time; go back further; "Stone tools precede
       bronze tools" [synonym: {predate}, {precede}, {forego},
       {forgo}, {antecede}, {antedate}] [ant: {follow}, {postdate}]
    3: lose (s.th.) or lose the right to (s.th.) by some error,
       offense, or crime; "you've forfeited your right to name your
       successor"; "forfeited property" [synonym: {forfeit}, {give up},
       {throw overboard}, {waive}, {forgo}, {forego}] [ant:
       {arrogate}, {claim}, {lay claim}]

Forgo \For*go"\, v. t. [imp. {Forwent}; p. p. {Forgone}; p. pr.
& vb. n. {Forgoing}.] [OE. forgan, forgon, forgoon, AS.
forg[=a]n, prop., to go past, hence, to abstain from; pref.
for- g[=a]n to go; akin to G. vergehen to pass away, to
transgress. See {Go}, v. i.]
1. To pass by; to leave. See 1st {Forego}.
[1913 Webster]

For sith [since] I shall forgoon my liberty
At your request. --Chaucer.
[1913 Webster]

And four [days] since Florimell the court forwent.
--Spenser.
[1913 Webster]

2. To abstain from; to do without; to refrain from; to
renounce; -- said of a thing already enjoyed, or of one
within reach, or anticipated. See 1st {forego}, 2.
[PJC]

Note: This word in spelling has been confused with, and
almost superseded by, forego to go before.
Etymologically the form forgo is correct.
[1913 Webster]

73 Moby Thesaurus words for "forgo":
abandon, abdicate, abjure, abstain, abstain from,
acknowledge defeat, avoid, cease, cede, cry quits, desist from,
disgorge, dispense with, dispose of, do without, drop, dump,
eliminate, eschew, forbear, forsake, forswear, get along without,
get rid of, give away, give over, give up, hand over,
have done with, hold, hold aloof from, hold back, hold off,
keep back, keep from, keep in hand, kiss good-bye, lay down,
leave alone, leave off, leave out, let alone, let go by,
make a sacrifice, never touch, not touch, not use, omit, part with,
pass up, quitclaim, recant, refrain, refrain from, relinquish,
render up, renounce, reserve, resign, retract, sacrifice, save,
shun, spare, stand aloof from, surrender, swear off, throw up,
turn down, vacate, waive, withhold, yield




Chinese-English Dictionary  2005-2009