framapiaf.org is one of the many independent Mastodon servers you can use to participate in the fediverse.
A Mastodon service provided by Framasoft, a popular-education nonprofit.

Server statistics: 1.4K active accounts

#google

302 posts · 250 participants · 33 posts today

✉️ #Google plans to roll out end-to-end encryption (#E2EE) for all users in the near future.

The company says the new encryption will not complicate life for users or place an excessive burden on IT administrators.

According to Google, the whole process is comparable to granting someone outside the company access to a Workspace document.

In cases where E2EE messages are sent to recipients who already have S/MIME configured, the encrypted email will reach the recipient as usual.

#Google #AI researchers were formerly like university researchers in this respect: They published their research when it was ready and without regard to corporate interests. For example, see the landmark 2017 paper introducing the transformer technology now in use by all major #LLM tools, including those from Google rivals.
arxiv.org/abs/1706.03762

More here.
en.wikipedia.org/wiki/Attentio

But that's changing. Google's AI researchers may now only publish their findings after an embargo and corporate approval.
arstechnica.com/ai/2025/04/dee

“‘I cannot imagine us putting out the transformer papers for general use now,’ said one current researcher… The new review processes [have] contributed to some departures. ‘If you can’t publish, it’s a career killer if you’re a researcher,’ said a former researcher.”

arXiv.org — Attention Is All You Need
The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. Experiments on two machine translation tasks show these models to be superior in quality while being more parallelizable and requiring significantly less time to train. Our model achieves 28.4 BLEU on the WMT 2014 English-to-German translation task, improving over the existing best results, including ensembles, by over 2 BLEU. On the WMT 2014 English-to-French translation task, our model establishes a new single-model state-of-the-art BLEU score of 41.8 after training for 3.5 days on eight GPUs, a small fraction of the training costs of the best models from the literature. We show that the Transformer generalizes well to other tasks by applying it successfully to English constituency parsing both with large and limited training data.
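For readers unfamiliar with the attention mechanism the abstract refers to, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of the Transformer. The function name, shapes, and toy data are illustrative assumptions, not taken from the paper's reference code.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Illustrative sketch: softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)      # (batch, q_len, k_len)
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                                     # (batch, q_len, d_v)

# Toy example: 1 sequence, 4 query positions attending over 6 key/value positions
rng = np.random.default_rng(0)
Q = rng.standard_normal((1, 4, 64))
K = rng.standard_normal((1, 6, 64))
V = rng.standard_normal((1, 6, 64))
print(scaled_dot_product_attention(Q, K, V).shape)  # (1, 4, 64)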

Request for DeGoogling advice: I have a Google Workspace account for just me. I'm the Workspace admin and I'm the only user. Everything is currently under my own custom domain. I know that I can download and archive everything, but I want to take it more slowly.

If I cancel Workspace, will I still have access to the emails, Drive files, and photos under a free @gmail.com account, or will I lose access to all of it unless I download it?

Thanks