• 0 Posts
  • 397 Comments
Joined 8 months ago
Cake day: March 20th, 2024


  • The only thing I’m really curious about is how far back the CPU gets throttled with the dGPU active and busy.

    On both of my machines, when I render a video using my GPU the CPU is still the limiting factor because of the codec I chose. On my 11th gen machine it took like 5 minutes before it was power throttled down to 25 watts. My gen 6 takes longer to power throttle and only drops to 35 watts, but either power level sucks. I already know the gen 7 dials back its clock speeds; I'm mostly curious how far it goes, and how quickly.

    The easiest way to test this is to open a video game that taxes both the CPU and GPU. I don't think the CPU throttles under light CPU loads like FurMark, which mostly hits the GPU. Maybe benchmarking software would cause it to throttle.
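    If you want numbers instead of eyeballing it, on Linux you can watch the package power fall as the firmware dials the limit back. A minimal sketch, assuming an Intel machine exposing the RAPL powercap interface (the sysfs path below varies by platform and is an assumption):

    ```python
    import time

    # Package 0 energy counter in microjoules; path is an assumption and may
    # differ (or be absent) on your machine.
    RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

    def watts(energy_uj_before, energy_uj_after, seconds):
        """Convert a delta of the RAPL energy counter (microjoules) to watts."""
        return (energy_uj_after - energy_uj_before) / 1e6 / seconds

    def sample_package_power(interval=1.0):
        """Read CPU package power averaged over one interval."""
        with open(RAPL_ENERGY) as f:
            before = int(f.read())
        time.sleep(interval)
        with open(RAPL_ENERGY) as f:
            after = int(f.read())
        return watts(before, after, interval)

    if __name__ == "__main__":
        # Start your render or game, then watch the number sag from the boost
        # limit down to the sustained limit (e.g. 45 W -> 25 W).
        while True:
            print(f"{sample_package_power():.1f} W")
    ```

    Same idea as watching the power readout in HWiNFO on Windows, just scriptable so you can log how long the boost lasts.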






  • Let me know how the thermals are on that machine. I ended up paying out the ass for a refurbished gen 6 because it comes with the 4090 and a MUCH bigger heatsink. From what I saw in the initial reviews, the performance is worse not just because the 100 series has worse IPC, but also because the machine doesn't boost as high since it's more thermally limited.

    HOWEVER the machine gets a LOT better battery.

    My gen 4 would get anywhere between 30 minutes and 2 hours of battery life unless I was doing literally nothing on it. This gen 6 gets like 4 hours unless I'm heavily taxing it, and people online say 7 hours is easily doable. Having a GPU that doesn't burn 20 watts sitting idle sure helps.


  • Which series? T/P or one of the economy options? The T, X, W, and later on P series have been the only models people really like.

    We have a few T series at work and they’re not bad. My T14 Gen. 1 doesn’t thermal throttle at all as long as its thermal paste isn’t toast. It will run at basically its full all core boost speeds all day long. The newer 12th Gen. machines dial their clocks back a smidge under full load, but that’s because they have 2x the cores of my measly 10th Gen. machine.

    Also I have a T14s AMD and that thing is a BEAST for such a small machine. 35 watts out of a 6-core AMD chip is no slouch for something that small, and I easily get 7+ hours of battery life even with my abusive use.





  • I actually really dislike DLSS and FSR. At least to me the upscaling is pretty noticeable (though not the end of the world), but the artifacts it causes drive me insane. I haven't tested Onion Ring with it, but in FH5, for example, I get all sorts of ghost images on my screen, and they go away as soon as I turn off DLSS.

    Also, weirdly, on my laptop I got worse performance with it on. I'm assuming that's because I'm 100% CPU limited and there's a bit of CPU overhead to running DLSS.








  • If you’ve got a Thunderbolt port on your laptop and a Thunderbolt dock, there’s no reason why it shouldn’t work.

    I’m not familiar with Thunderbolt on Linux, but on Windows you plug it in and it just works™️, showing up as if it were inside your machine. Your DE on Linux might handle the device authorization automatically, but if you’re command line only you’ll probably have to run a command first.
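    On most distros that command is `boltctl`, from the `bolt` daemon: `boltctl list` shows connected devices and their authorization status, and `boltctl authorize <uuid>` authorizes one that shows as pending. A tiny Python sketch around that check (the `bolt` package name is an assumption, and `<uuid>` stays whatever your dock reports):

    ```python
    import shutil
    import subprocess

    def thunderbolt_status():
        """Return 'managed' if boltctl (the bolt daemon's CLI) is on PATH,
        else 'unmanaged'."""
        if shutil.which("boltctl") is None:
            print("boltctl not found; install your distro's 'bolt' package")
            return "unmanaged"
        # Prints connected Thunderbolt devices and their authorization status.
        subprocess.run(["boltctl", "list"], check=False)
        # To authorize a pending device once, you'd run:
        #   subprocess.run(["boltctl", "authorize", "<uuid>"])
        return "managed"

    if __name__ == "__main__":
        print(thunderbolt_status())
    ```

    If your DE already authorizes the dock for you, none of this is needed; it's only for the console-only case.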