It's more expensive than the Alienware, but way cheaper than say... a car. Or a house. So that's a great deal!
Not much range in there
Who buys the QD-OLED Alienware when shipping is August–October 💀
Huh… buying one now then
Gotta love the sponsored videos...
The editing on the fallen cable was pure gold lol
@M4rio21 that and the ??? on the mystery I/O port. Still never found out what that was
Hey guys. - Linus. I have a great idea: buy a NUC 12 with the lowest spec, put a 12900K into it, undervolt it, power limit it, disassemble the whole thing, and keep only the compute-unit/motherboard thingie and the two PCIe x16 baseboards. Buy an RTX 3090 Turbo (the 2-slot beast), change the VRAM pads on it, use higher-quality thermal paste, swap the NUC 12's base SFX power supply for a 700W Flex-ATX, and 3D print three housings for this hardware for three different form factors or use cases: 1. a Mac mini / Mac Studio competitor, 2. an all-in-one PC on the back of a 1440p 240-270Hz 27-inch or 4K 10-bit HDR monitor, and 3. the most powerful (slightly thick) 16-20-inch fastest gamer laptop. Please use the lightest material possible, like carbon fiber printing, both so the monitor is easily portable and to swap the monitor's metal motherboard base plate for carbon fiber or some hard plastic. Nobody makes things like this, but these are very good projects; if I had the time and money, I would make something purchasable out of them. Let's see what it would look like. The monitors I'm referring to currently are the Acer 1440p 270Hz, the Alienware 27-inch 240Hz, and the LG 4K 160Hz, but possibly mini-LED in the future if possible. On the laptop side, I want to see bigger laptops, up to 21 inches (maybe a bit less), with the same quality screens as the bigger ones, and also the possibility of using a touchscreen panel with pen support. What are your opinions?
Just missed the WASTED text.
@M4rio21 no it is pure copper 🤣
I agree. Give this editor a raise.
"costs a bit more than the Alienware" Bruh, it's twice the price.
He thought it was 1300 I think
@e21big But OLED has true blacks, so better contrast. There is a reason OLEDs have separate HDR specifications. Not saying these QD-OLEDs are better (they still have the color-fringing problem and are less bright), but both have their ups and downs.
You can buy a 48” LG C1 and the Alienware QD-OLED monitor for the price of this one monitor. It seems to be great for a hybrid workload, but if you’re gaming, the Alienware makes more sense.
@Lead Foot Is it? QD-OLED can do 200 nits on a full-screen white frame, so 5x is... 1000 nits!? I have a hard time believing that; even a 2000-nit mini-LED can't sustain 1000 nits full frame, it's usually 600-700 nits at most.
For such a massive power brick, I was expecting 100W USB-C power delivery
It's so they can charge you more for a needlessly huge power brick you don't need, while leaving out actually useful features to save even more money.
Agreed. The brick can supply something like 330W of power, and the monitor itself barely consumes a quarter of that.
For my laptop it's half the amount it needs... quite useless for me ))))
Seems like a great option for hybrid use. Honestly, I am pretty happy with my new monitor, the Acer ConceptD CP5. It has almost no media presence in the form of reviews and I got it as b-stock for 439€ and it has 170Hz, 2560x1440 IPS, 600nits, factory colour calibration and 10bit FRC. As a gamer and hobby photographer with not that much experience, this is an awesome middle ground.
@gamebuster800 Yes, that's a great display. If I'd had more money to spend and a better graphics card than a GTX 1070, this would have been my choice. But Cyberpunk at 1440p with a 1070 is already a challenge :)
I got the CP7 (4K 144Hz G-Sync ultimate HDR1000 local dimming) and I got it for 1600 EUR. I've never heard of the display before I got it, but it's great!
Nice find, thanks!
Looks like a nice display, but the lack of HDMI 2.1 is completely unacceptable at this point. Nvidia needs to update their controller to support that ASAP.
HDMI 2.0 on a monitor this expensive is inexcusable. End of discussion. Full 2.1 should be the standard for any monitor claiming to be for gaming.
@Zhyvora I'd rather it just be FreeSync Premium at that point then. It should still work fine with ANY GPU, and shave some off the price.
@Zhyvora yeah to be fair it is definitely a nvidia issue as opposed to the gsync integrators. Nvidia being even closer to the Apple comparison than anyone else of course.
@unQ Honestly, the only way to support HDMI 2.1 is to remove the G-Sync module, so at that point it's up to whoever is making the monitor. You either get full G-Sync Ultimate compatibility, or 2.1.
@Zhyvora tbh we could hypothesise different use cases all day but i don’t think it excuses a 2.5 grand monitor having an outdated HDMI spec in 2022. Only anti consumer companies do that kinda stuff cough apple cough, and unfortunately we still buy their crap. As i type this on an iphone 13 pro max 🤒🤒
@unQ If people can afford this monitor you'd bet they won't be on a console on this. They'd have their console on a dedicated OLED TV in a media room that's either combined with their PC or they have a dedicated PC room/office where this would be at. Money at this level is literally incomprehensible for a lot of people.
You keep saying you don't want the gap between the plastic and the active display area, but it's part of the manufacturing process. To go right to the edge of the display you'd have to pay through the nose for really tight manufacturing tolerances, or allow pixels to go *under* the bezel and just accept a small variation in the number of vertical pixels. Even giving up a few rows of pixels on your monitor would still leave a fraction of a pixel of black space unless you were willing to accept a row of partially obscured pixels. That would only be maybe 10 thou, but I don't notice the 2mm bezel on my monitor, so I can't guess whether that would also bother you.
@David Wales making a bespoke barrier at home on your 3D printer so it specifically fits your monitor doesn't scale to mass manufacturing.
I think his point is if it could be fixed by you at home with a 3d printer it shouldn't exist in a shipped product, especially at $2500.
You don't mention that this is apparently a true 10-bit display, which is probably the most important part, especially for a creative (working in Photoshop, for example).
@MrDavibu Well, the fact that *nobody* has done that yet makes it seem more or less impossible (and given NVIDIA is all about proprietary stuff, they're probably even missing documentation if they really wanted to do something like that, so they'd probably need to basically duplicate all the electronics and try to drive them in parallel, and at that point it's probably really impractical). I mean, the more obvious solution would have been to create a G-Sync Compatible display, which probably wouldn't be that much of a difference and could even be used by the consoles. But it seems like they wanted the minor improvement from G-Sync more than console support, and I guess part of the reasoning might be that if you're spending that kind of money, you'd probably buy a TV for your console anyway.
@Cromefire_ I mean, for that price they could probably have added additional circuitry for an HDMI port that skips over the G-Sync module; it shouldn't be that hard.
@David Sanders No, the thing is that the G-Sync Ultimate module they are using only does HDMI 2.0b, so you currently won't see any G-Sync Ultimate monitor with 2.1 until NVIDIA gets that fixed. (The G-Sync module, after all, is a hardware module made by NVIDIA, not just a protocol like FreeSync.)
@Cromefire_ I realize that G-SYNC is not supported over hdmi, but I don't see why they couldn't include up to date ports. It's not like they are trying to hit a price point here. This monitor is not cheap and yet, this feels like a way to cheap out, at least to me.
@David Sanders Well, it's either HDMI 2.1 or G-Sync, and they seemed to rather go with G-Sync.
I have the ASUS version of this monitor. While it is very expensive, it has been the single greatest upgrade in my 20 years of PC gaming.
@gape You're right, I spend most of my time online pretending I own hardware, all because I don't own an HDMI 2.1 display. I mean, how can I call myself a PC gamer if I don't use HDMI 2.1?
@Marselinos all this stuff is subjective. Whatever makes us happy is the best 🤙🏻
@Lex Lutha111384 Still 800 nits then... my LG CX does 200 nits in SDR, and I have maximum energy saving on and brightness low. In HDR it gets 800 nits, but I'm using 20% brightness. 200 nits is my limit, else my eyes tear up and hurt... at 800-1400 nits I'd use 10-20% brightness, depending on SDR/HDR.
@Connor Johnson You're a PC gamer with a 3090 Ti and you don't even use a 2.1 display? I'm dead. I doubt you even own a 3090 Ti.
@Marselinos It's not full brightness, it's peak highlights. People get that confused all the time. If you're looking at a night sky, don't you want the stars to be super bright? Like real life? The higher the peak, the more realistic things look.
LMAO, $2,500. So, in other words, I can buy a C2 42" and an Alienware 34" QD-OLED for this price.
@Maca Is it a better experience? 42” maybe, but the 34” will be ultrawide and not all games play nice on it. And how much better is QD-OLED vs miniLED? This one’s more of a legitimate question, as I thought miniLED was supposed to be a fantastic option over OLED displays especially for creative work.
And end up with a better experience 😅
For people wondering what the undefined port on the back of the display is that LTT labeled "???": according to the user manual it is a JTAG port, most probably used to check/verify that the board is working after production.
@Matt It was a bit tricky to find; a simple Google search was not enough 😅 I love it when the editor corrects them, even though many people look up to this channel for their expertise.
Thanks for looking that up. Weird they didn't look that up and add it in the video in post.
I wish there was this kind of monitor in 27", I feel like 32" is WAY too big and I constantly have to look down or up when gaming.
Love my ViewSonic XG350R-C. It's not top of the line like this model, but it's damn good value when you can get a 35" 100Hz 1440p ultrawide for $450.
I've been looking for a monitor for the past couple of months, and this one checks all the boxes for me (except maybe the price). The closest monitor to this one is the ASUS ROG Swift 32", which is more expensive and doesn't look as good.
That price puts it in a different league than the Alienware. You could almost get two of the QD-OLEDs for one of these.
@Sam Goff A monitor would be a ripoff if there is a competitor that can offer the same thing but for way less. MINI LED in a 32 in panel, regardless of who makes it, is just expensive.
@TwoGoodMedia Then why do they keep comparing it to the Alienware throughout the whole video? Compare it to, idk, comparable displays? Like the ProArt?
@TwoGoodMedia oh ok..."it is cheaper than this other ripoff monitor so it is good"
When 4K120Hz non-curved QD-OLED comes out it's going to be cheaper and much better than this.
@Sam Goff They have yearly roundups on their main channel, they just didn't do one for 2021. BUT they were pretty clear in their Alienware video to wait for the new QD-OLED displays that will inevitably come out. As for "ripoff": name a 32 in 4K 120Hz mini-LED monitor for under 3k. I hate doing the Apple "but there isn't another 5K monitor" thing, but this undercuts most other monitors in its range by about 1-2k and has a solid mix of features for gamers + artists.
What a time to be alive for creatives. If anyone wants something cheaper with the same panel (from what I can tell) check out msi's quantum dot range.
9:40 Cyberpunk and Metro Exodus Enhanced Edition are the only worthwhile titles that come to mind that a high end GPU can't handle at 60FPS on highest RT settings so I think 4K is already feasible. 32" is the minimum monitor size though imo and bigger OLED TVs are perfect. I'll choose my 48" CX anytime over a smaller higher Refresh 1440p IPS panel.
It is. I run every game at 4K on my 3080 and usually start around 160 FPS, synced with my monitor's refresh rate. Some seriously graphically intensive games may run a little worse, but I rarely drop below 80-90 even in those. Psycho in Cyberpunk is different, but that's literally designed for screenshots and such.
This display doesn't make sense to buy right now. Mini-LED has had basically no competition, and that's why they charge that much. I already saw Alienware's monitor for $1000 at some point, and that's before any other QD-OLEDs hit the scene and slash prices down even more. Simply put, these monitors will need a dramatic price cut to compete in that range, which is exactly why, regardless of whatever advantages it has over QD-OLED, buying a mini-LED *right now* is probably a bad decision.
I really love these monitors, but I wish they came in 1440p so I could actually play on them without waiting for next-generation hardware (a 3090 Ti isn't enough for what I play).
"creative friendly gaming monitor" only shows gaming, nothing about color accuracy or show any creative's use. 🤦♂️
@Treazure Yes in a review, but this is not a review; it's just meant to get you interested in the monitor, and with this one it'd probably be pretty boring since it'll be calibrated quite well.
@Cromefire_ they show color accuracy info on lots of other monitor reviews. Just seems like an oversight to be talking the entire video about how it's "creative" more than gaming and not show "creative" stats.
Well you can't show a lot of color accuracy on camera so it's not really suitable for an unboxing, for that kind of info you'd probably want to look for a review instead (if you don't trust their claims, which I wouldn't).
I like having my 4k monitor, because I'm more than comfortable playing at 1080p for triple A titles, and I love cranking the res up on older games. 1080p still looks crisp on a 4k display, unlike a 1440p display.
Also 4k is the equivalent of 9 720p screens hooked up, so 720p will have perfect upscaling to it, the same as it would by doubling in each direction to 1440p.
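The arithmetic behind that claim is easy to check: a resolution upscales cleanly only when the target is a whole-number multiple of it in both directions. A minimal sketch (the resolution names and the `TARGET` constant are just illustrative assumptions):

```python
# Check which common resolutions scale to 4K UHD (3840x2160) by an
# integer factor, i.e. each source pixel maps to an exact NxN block.
TARGET = (3840, 2160)

resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
}

for name, (w, h) in resolutions.items():
    sx, sy = TARGET[0] / w, TARGET[1] / h
    if sx == sy and sx.is_integer():
        # e.g. 720p is a 3x scale, so 4K holds 3*3 = 9 full 720p frames
        print(f"{name}: integer {int(sx)}x scale, {int(sx) ** 2} frames fit")
    else:
        print(f"{name}: non-integer {sx:.2f}x scale, upscaling will blur")
```

This confirms the comment: 720p fits 4K at exactly 3x (9 frames' worth of pixels) and fits 1440p at exactly 2x, while 1440p-to-4K is a non-integer 1.5x, which is why 1080p/720p content can look crisper on a 4K panel than on a 1440p one.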
When you guys do that GSync Vs FreeSync Vs VSync video, please be meticulous when examining Freesync and or GSync Compatible monitors. They have issues with odd fluctuations in refresh rate at any numbers lower than the max refresh rate. Thanks!
Seems like a monitor I've been looking for. 4K, high refresh rate, proper HDR, not OLED. A tad expensive but welp.
A tad expensive... AT *$2,499!* 🤪
6:51 They do actually; some laptops can't bypass Optimus unless they're plugged into an external monitor.
The lack of proper HDMI 2.1 is a disappointment. I want a monitor that does all console and PC gaming.
@John Buscher In both cases they skimped on a pretty standard port? Both included a worse version of said port. And to your point, The MacBook IS an incredibly powerful video editing machine, but they are holding your power back on HDMI to force you into certain product lineups. A product designed for video editing should probably have the top end video outputs, not ones that are a generation behind.
@TwoGoodMedia Except the MacBook is also a fully functioning computer that beats many of its contemporary competitors for creative work, and even some code compilation, to say nothing of battery life. That's why it's more offensive in something that is "just" a monitor.
@Brown Lab That's kind of a limiting feature for a $2k monitor. Well, hey, if they don't want to expand their market toward console gamers, it's their loss.
@Brown Lab Standard features are always nice to have, along with options. A $2500 monitor shouldn't skimp on 2.1 just like the $2500 macbook...
PC gamers don't care nearly as much about HDMI 2.1 because DP exists. Console peasants have no use for something like this
I do a mix and got the AW3821DW. It's a pretty good compromise between gaming and creative work.
RIP displayport cable, gone too soon 😢 edit: Nvm it's hdmi, it deserved it.
You can never expect an objective opinion when it's a sponsored video. It's crazy expensive, with a very particular market in mind: high-end gaming and/or professional work. Everyone else, keep looking.
i would really love a 27" 1440p version of this. edit: in fact it turns out there is a 27" 1440p version of this. sort of. it's not mini LED, and the color gamut doesn't appear to be quite as wide, but it is also $800 instead of $2500.
@C0BEX because my experience with g-sync compatible monitors has been abysmal and i'm willing to pay the premium to no longer have that abysmal experience. i don't care about local dimming, but i do care about my entire screen flickering when the framerate is hovering right on the edge of where g-sync stops working on a g-sync compatible monitor.
@Sam Goff ah, youtube comments as helpful as always... say something is bad, but don't have any useful suggestions.
@B. M. Kinda wondering why you want a G-Sync module in your monitor? Especially if you're looking for something without proper local dimming. (1440p with proper local dimming doesn't exist.)
@B. M. I am not your research assistant
@Sam Goff do you know of something similar for a more reasonable price? genuinely asking, i am in the market for a 27" 1440p, 144hz+, g-sync (not g-sync compatible) monitor with the widest color gamut possible, preferably approaching 100% of Adobe RGB coverage. i enjoy games and photography, and have been having to edit my photos on my laptop because its display is far superior to any of my desktop displays.
At first I was like, "$2500?!! What?!" But then I saw it had height-adjustment built-in, so that's like $1,000 of value right there -- at least according to Apple -- so, really, you're only paying $1,500 for the monitor itself. ;-)
Would love it if there were a glossy display option.
Why is the QD-OLED not for creatives? I thought it had really good color accuracy.
Can't wait for this tech to become actually affordable. I've been waiting almost a decade to upgrade my monitors, and this is finally something interesting, but you can definitely see the early-adopter tax being strong in these products.
I agree. The ASUS version of this monitor launched first and was 3k! Talk about early-adopter tax. It had some other minor features, but it's a bit overpriced IMO.
The thing is: you don't need Ultra settings. You've never needed ultra settings. I'd rather run at High/optimised settings and 4K than Ultra + 1440p on a large monitor
Yup. He went on to moan about not wanting to game at 4K because of the framerates, yet if he spent 2 minutes in the graphics menu he'd easily be able to run it at 90+ FPS at 4K. Cyberpunk's Ultra settings are overrated.