#gpipe
eccentric-nucleus · 5 months
Note
Not sure if there's a good way to phrase this since I get annoyed at a lot of these kinds of questions, but
Is there a particular reason that you go for OpenGL + Haskell? My heart tells me that those are the worst possible fit (procedural API over the top of a big hidden state machine w/ soft real-time requirements vs a runtime that wants to do pure functions and lazy evaluation). That said, you seem to get to interesting places with some regularity whereas my projects (C/C++ and vulkan usually) tend to die before I get to the cool part, so I'm wondering if there's something to it
i just got frustrated with c-alikes and i really enjoyed aspects of haskell coding. it is objectively a very silly combination, although not as silly as it has been historically given the various improvements in haskell gc over the years.
historically i've used gpipe for haskell rendering, which does some astounding type family wizardry to basically fully-hide the opengl state machine and also let you write shaders in actual haskell (values in the shader monad are actually part of a compositional scripting type that evaluates to glsl code. it's wild.) so it's about as close as you can get to totally ignoring all the opengl-ness of opengl. that being said, uh, that has some problems (zero memoization in generated scripts; very unclear surfacing of real opengl constraints)
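The "compositional scripting type that evaluates to glsl code" idea can be sketched in miniature. This is a Python toy, not GPipe's actual Haskell API, and every name in it is invented; it just shows the general technique of overloading arithmetic so that ordinary-looking expressions build a tree which pretty-prints to GLSL, including the zero-memoization pitfall mentioned above:

```python
# Toy sketch of shader codegen via expression trees. Shader "values"
# are really ASTs; evaluating one emits GLSL source text.

class Expr:
    def __add__(self, other): return Op("+", self, lift(other))
    def __mul__(self, other): return Op("*", self, lift(other))

class Var(Expr):
    def __init__(self, name): self.name = name
    def glsl(self): return self.name

class Lit(Expr):
    def __init__(self, value): self.value = value
    def glsl(self): return repr(float(self.value))

class Op(Expr):
    def __init__(self, sym, lhs, rhs):
        self.sym, self.lhs, self.rhs = sym, lhs, rhs
    def glsl(self):
        return f"({self.lhs.glsl()} {self.sym} {self.rhs.glsl()})"

def lift(x):
    return x if isinstance(x, Expr) else Lit(x)

# Ordinary-looking arithmetic builds a tree instead of computing a number.
uv = Var("uv.x")
expensive = (uv * uv) + 1.0
brightness = expensive * expensive  # the value is "reused"...

# ...but with zero memoization, codegen duplicates the whole subexpression:
print(brightness.glsl())
```

The duplicated `(uv.x * uv.x)` in the output is exactly the kind of thing a real system would need common-subexpression elimination (or explicit let-binding) to avoid.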
also to be fair my projects also tend to die before i get to the cool part, it's just that sometimes i manage to get some neat renders out before that happens.
(right now i've been wanting to jettison the gpipe library in favor of just doing raw opengl right in IO, mostly so i can actually use opengl 4 features that aren't surfaced either in gpipe or in the OpenGL package, but ofc the first step there would be a whole bunch of low-level data fiddling. but since i've been doing webgl2 + javascript also, uh, the opengl part would mostly end up being exactly the same, so it feels a little less intimidating now. i just really wanna write some wild shader code, and to write really wild shader code you do kind of have to directly interface with raw opengl.)
nostalgebraist · 2 years
Note
Models like GPT-3 are far too large to fit on a single GPU, and some even bigger ones struggle to fit on a single machine (say 8 x 24 GB). But training requires lots of linear algebra operations with dense tensors. How does one multiply matrices too big to fit in memory acceptably fast? Also how bad are communication overheads when the data to transfer is ~full memory? A ~textbook reference is fine
I'm far from an expert on this stuff, but basically, uh... you just split them up across devices, communicate across devices when necessary, and try to design the splitting to minimize communication.
Also, you train on clusters designed for high bandwidth (like TPU "pods").
Is there significant communication overhead? Yes, and it becomes the main constraint when training very large models.
The term for this is "model parallelism." The Megatron-LM paper is a relatively readable reference.
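The "split them up and minimize communication" idea can be sketched numerically. This is a numpy toy with made-up sizes, simulating Megatron-style tensor parallelism on one machine (arrays stand in for per-device memory), not anything from the actual Megatron-LM codebase:

```python
import numpy as np

# Sketch of tensor ("model") parallelism: the weight matrix is too big
# for one device, so split it column-wise, let each "device" compute
# its slice locally, then gather the slices.

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))   # activations, replicated on every device
W = rng.standard_normal((8, 6))   # full weight: pretend it doesn't fit anywhere

n_devices = 2
W_shards = np.split(W, n_devices, axis=1)  # each device holds 8x3, half the memory

# Each device does a local matmul on its own shard, no communication needed.
partial = [x @ W_k for W_k in W_shards]

# One all-gather (here: a concatenate) reassembles the full output. This
# communication step is the overhead the splitting is designed to minimize.
y = np.concatenate(partial, axis=1)

assert np.allclose(y, x @ W)  # same result as the too-big full matmul
```

Row-wise splits work too (each device then produces a partial sum and the gather becomes an all-reduce); real frameworks alternate the two to keep communication down.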
More recently, people talk more about something called "pipeline parallelism," introduced in a paper called GPipe. I don't really understand it but I think it's some sort of variant on the same idea.
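For what it's worth, the GPipe idea can also be sketched: consecutive layers go on different "devices" (pipeline stages), and the batch is split into micro-batches so stages can work on different micro-batches at once instead of idling. The stage functions and sizes below are invented for illustration:

```python
import numpy as np

# Sketch of pipeline parallelism: one stage per "device".
stages = [lambda a: a * 2.0, lambda a: a + 1.0, lambda a: a ** 2]

batch = np.arange(8, dtype=float)
micro_batches = np.split(batch, 4)  # 4 micro-batches of 2

# Numerically the pipeline is just function composition per micro-batch:
outs = []
for mb in micro_batches:
    for stage in stages:
        mb = stage(mb)
    outs.append(mb)
result = np.concatenate(outs)

# Timing-wise, with S stages and M micro-batches, a pipelined schedule
# needs S + M - 1 ticks instead of S * M: while stage 1 handles
# micro-batch 0, stage 0 is already handling micro-batch 1, and so on.
S, M = len(stages), len(micro_batches)
pipelined_ticks = S + M - 1  # 6 ticks here, versus 12 fully serialized

assert np.allclose(result, (batch * 2.0 + 1.0) ** 2)
```

The idle slots at the start and end of that schedule are the "pipeline bubble"; more micro-batches shrink it relative to total work.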
oyeahgifts · 6 years
Photo
Roll away with O Yeah Gifts @saltydog_gpipe !!! #oyeahgifts #saltydog #surf #skate #skateboard #dog #jewelry #bracelets #beachlifebracelets #beachlife #necklace #charmnecklace #crystals #florida #ormondbeach #gpipe #local #supportlocal #surfgirl #surfer #skater #girl (at Salty Dog Surf Shop Granada) https://www.instagram.com/p/BpnDUdFAdXo/?utm_source=ig_tumblr_share&igshid=1umfqouatpln2
boardtheory · 6 years
Video
Last night had a few gems :) stoked to get some with z homies ... hope y’all are enjoying. 5’0” twin fish fry unda my feet 🚀 thanks for shooting wade, more soon! 🖤🐿 #boardtheory #handcrafted #unconventional #surfing #gpipe #fishes #skatepark #spinning #chinaboardssuck #ormondbeach #daytonabeach #madeinflorida #surferbuilt #surfershaper (at Ormond Beach, Florida) https://www.instagram.com/p/Bnoc4L1jmeb/?utm_source=ig_tumblr_share&igshid=1i1794l4tg6na
hackernewsrobot · 5 years
Text
Google open sourced GPipe, a library for training large ML models
https://ai.googleblog.com/2019/03/introducing-gpipe-open-source-library.html Comments
tensorflow4u · 6 years
Photo
#CloudTPU + TensorFlow train highest-accuracy models on ImageNet-2012 (84.3%) / CIFAR-10 (99%) / CIFAR-100 (91.3%) trained with public data. The GPipe library empowers efficient training of large models using pipeline parallelism. Learn more here → https://t.co/AE97891dcf https://t.co/JGqHwEelq0
insomniac-isotope · 2 years
Photo
One fun thing about opengl is that I had full access to a depth buffer, and so I could actually correctly place everything on the z-axis. Actually doing that in practice meant writing some custom shader code and figuring out all the relevant math, which wasn't entirely trivial.
This has actually been my first time directly writing GLSL code with the raw opengl pipeline. The prior 3d stuff I posted was all either done with ancient versions of opengl using the fixed-function pipeline, or done with gpipe, which abstracts away a lot of the shader specifics. Here, I was just directly given the problem of needing to output depth-placed images within the opengl clip cube with no other considerations, and I got to work. The current shader is entirely '2d' in that there's not even an isometric camera matrix -- all coordinates are calculated to screen pixel before being handed to the shader, and associated with a precomputed 'depth' value.
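The math in question, sketched in Python rather than GLSL. OpenGL's clip cube runs -1..1 on every axis with y up; the screen size and the depth convention below are assumptions for illustration, not the post's actual values:

```python
# Map a screen pixel (origin top-left) plus a precomputed depth in
# [0, 1] into OpenGL normalized device coordinates.

def pixel_to_ndc(px, py, depth, width=800, height=600):
    x = 2.0 * px / width - 1.0    # left edge -> -1, right edge -> +1
    y = 1.0 - 2.0 * py / height   # screen y grows down, NDC y grows up
    z = 2.0 * depth - 1.0         # near (0) -> -1, far (1) -> +1
    return (x, y, z)

# A sprite at the screen center, halfway into the depth range, lands at
# the middle of the clip cube:
print(pixel_to_ndc(400, 300, 0.5))  # -> (0.0, 0.0, 0.0)
```

With the depth test enabled, fragments whose z maps nearer win, which is what lets precomputed per-sprite depths sort correctly without any camera matrix.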
In the future if I want to do anything particularly fancy with shaders I'll probably have to change that to manage things in world-space coordinates with an actual camera transform uniform, but for now this works fine. Well. After I worked out all these bugs.
fyrecell · 5 years
Photo
I should use Gpipe to build a porn site with the PornHub community of sites that converts potential porn to CGI. https://www.instagram.com/p/Buo-FPIAQ1e/?utm_source=ig_tumblr_share&igshid=1e9r0iw6xao31
eccentric-nucleus · 6 months
Text
i think for 2024 i'm gonna try actually working on a game project again. we'll see how that goes b/c also what i'd like to do is pull out gpipe and replace it with OpenGLRaw now that i've gotten a little more used to actual opengl code & writing shaders. doing all that by hand is probably gonna be less messy than trusting any shader code autogeneration layer.
also i mean i still have the hell game 2 demo that i could work on but i don't even know what i'd want to do with that.
amazingvideosposts · 5 years
Photo
New top story on Hacker News: Google open sourced GPipe, a library for training large ML models https://ift.tt/2IQUDDa
metabloks · 5 years
Photo
Google open-sources GPipe, a library for efficiently training large deep neural networks https://ift.tt/2SJSZTg
insomniac-isotope · 6 years
Photo
I'm trying out more 3D stuff. Currently I'm just getting used to the new framework I'm using (gpipe) and trying to extend the basic rendering code I've been using for rendering plants/houses/etc into the world of shaders and textures and text. The text isn't working quite right yet.
unctech-blog · 5 years
Text
Google open-sources GPipe, a library for efficiently training large deep neural networks
For more, visit Tech-on-news.blogspot.com
https://ift.tt/2IPPVoZ
suwa-sh · 5 years
Text
BigGAN: class-conditional high-resolution image generation. Scaling up the SOTA method produced cleaner results. GPipe: a distributed training library optimized for training huge NNs. Using NNs to search for NNs: after many repetitions, they got stronger. #TechOn東京
— 諏訪真一 (@suwa_sh) May 13, 2019
via Twitter https://twitter.com/suwa_sh May 13, 2019 at 08:39PM
eurekakinginc · 5 years
Link
Posted by WebHostingSaver via /r/artificial. Join Discussion: https://ift.tt/2UvKOLZ. Curated by: www.eurekaking.com
ericvanderburg · 5 years
Text
Google Open-Sources GPipe, a Library For Training Large Deep Neural Networks
http://i.securitythinkingcap.com/R0FF52