The armored Apex Stealth fans promise the “impossible”

Alphacool Apex Stealth

Alphacool has unleashed the new Apex Stealth fans on the world, and they attract attention for two things in particular: firstly, the robust metal elements they contain, and secondly, the specifications. The latter seem literally incredible, and while we have no doubt that the efficiency of these fans will be top-notch, it’s hard to find design elements that would make them “the best”, as the specs suggest.

Compared to competing fans, the Apex Stealth is clearly differentiated by its metal frame cover. It is made of die-cast zinc, and Alphacool’s very nice illustration shows clearly how it is integrated into the fan. It essentially consists of flanges clamped by a circular frame that carries the motor. That inner frame is made of some kind of plastic, as is the rotor. Exactly what kind of plastic is hard to say, as Alphacool doesn’t specify it, but it’s definitely nothing metallic.

Unlike, for example, the all-metal Prolimatech Vertex fans, only the frame cover of the Apex Stealth is metal. To suppress vibrations, there is a rubber ring between the cover and the plastic stator frame. If that damping doesn’t work perfectly and the fan transfers some vibrations to the cover, the high hardness of the material means they may be passed on to the case skeleton or a radiator more efficiently than with a traditionally softer material. However, we assume that Alphacool has taken good care of this and that vibrations either do not reach the cover at all, or do so only to a negligible degree that will not become a source of secondary noise (by vibrating the body the fan is in contact with).

As we mentioned in the intro, the top-notch specs are supposed to be the main appeal and, if true, they would make this perhaps the most efficient 120mm radiator fan ever. At this point we would like to pause and elaborate on a few technical details that do not support such claims. We certainly do not want to question the work of anyone at Alphacool; on the contrary, we take our hats off to them for an extremely impressive design. At the same time, in the context of fair, free competition, we would like to somewhat tone down the enthusiasm that the parameters may stir up among users.

There can be no doubt about the top-notch efficiency of the Apex Stealth fans, as they are built on a very efficient geometry, but there are a few question marks when it comes to the airflow and static pressure parameters. At a speed of 2000 rpm, an airflow of 130.5 m³/h is supposed to be achieved. That’s roughly 36% above the design-wise very similar Cooler Master Mobius OC, which apparently even has a slightly wider rotor and a comparably sized hub.

The “effective” area figure (the difference between the total rotor area and the hub area) thus rather plays against the Apex Stealth. The static pressure at the same speed is supposed to be about twice as high (3.88 mm H2O) for the Alphacool fans compared to the CM Mobius OC, which is even more eye-popping. This would not be physically possible with nearly identical designs (9 blades with comparable curvature, blade spacing and hoop-to-frame clearance) even if that clearance were half the size. The profile thickness is the standard 25 mm. If someone from Alphacool can come up with a reasonable technical explanation of what allows such a high static pressure to be achieved compared to geometrically almost identical solutions, we will be happy to publish it.
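
To illustrate the “effective” area argument, here is a minimal sketch with purely illustrative dimensions (assumptions for the example only, not official figures for either fan): a small difference in rotor or hub diameter changes the air-moving annulus by only a few percent.

```python
# Effective area = rotor disc area minus hub area (the annulus that actually
# moves air). Dimensions below are illustrative assumptions, not official
# figures for any of the fans mentioned.
import math

def effective_area_cm2(rotor_diameter_mm: float, hub_diameter_mm: float) -> float:
    """Annulus area swept by the blades, converted from mm^2 to cm^2."""
    rotor = math.pi * (rotor_diameter_mm / 2) ** 2
    hub = math.pi * (hub_diameter_mm / 2) ** 2
    return (rotor - hub) / 100.0

fan_a = effective_area_cm2(rotor_diameter_mm=112, hub_diameter_mm=44)  # hypothetical fan A
fan_b = effective_area_cm2(rotor_diameter_mm=114, hub_diameter_mm=44)  # hypothetical fan B

print(f"fan A: {fan_a:.1f} cm^2, fan B: {fan_b:.1f} cm^2")
print(f"difference: {abs(fan_a - fan_b) / fan_b * 100:.1f} %")
# A few percent of extra effective area cannot plausibly explain a ~36 % airflow
# gap, let alone a ~2x static pressure gap, at the same speed and blade count.
```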

There are two speed variants of the Alphacool Apex Stealth fan. The faster one (400–3000 rpm) carries the additional designation “Power” in its name, the slower one (400–2000 rpm) does without it. Both are available in four colours (with white, matte black, chrome and gold covers). All fans (slower and faster) have the same suggested price of 30 EUR, at which you can pre-order them at shop.alphacool.com.

English translation and edit by Jozef Dudáš



Alphacool introduces the “Core” series of high-speed fans

Six new models, with the slowest capping at 2,000 rpm and the fastest reaching up to 4,000 rpm. It’s a mix of 120 and 140-millimeter fans, where Alphacool reached for differently modeled rotors for the larger format. This is partly to ensure that a fan this fast and relatively large can operate with reasonably low vibration at all. Among Alphacool’s new products is therefore also a fan with more robust blades. Read more “Alphacool introduces the “Core” series of high-speed fans” »


Plastic vs. metal backplate Alphacool (under LGA 1700)

For rather incomprehensible reasons, the cooling performance of the vast majority of AIO liquid coolers is degraded by the use of an unsuitable backplate. The backplate is usually undersized, made of plastic, and cannot exert optimal pressure on the processor. To give you an idea of how a “traditional” plastic backplate stacks up against a proper, steel one, we’ve prepared a comparison of the two. Read more “Plastic vs. metal backplate Alphacool (under LGA 1700)” »


Alphacool: Our new thermal paste is faster than stainless steel

Watch out, Alphacool has started aiming pretty high. In the specifications for the latest Apex thermal paste, it lists a thermal conductivity that it claims is not only above the best competing thermal pastes, but even outperforms some metals. Unless Alphacool is stretching reality, this is going to be an extremely effective formula. You won’t guess what its main ingredient is. Read more “Alphacool: Our new thermal paste is faster than stainless steel” »


Comments (28)

  1. Perhaps this fan should be put high up in the review priority list. Some dubious data is floating around, if you know what I mean… that plus some mainstream media is hyping up the fan way too much.

    1. Some stuff got to me, but I try to filter it out. Otherwise, I worry unnecessarily about the effect that obvious nonsense has on public opinion. And the latter is what we care about, because we are trying to make sure that everything is clear and makes sense on the issue of fans. On the other hand, I do not think it is right to respond to something only to have to refute someone’s conclusions.

      We have a different roadmap of fan tests and although the Apex Stealth is remarkable and will probably rank among the top, it would probably be unfair to delay the rest even further and it would also be a shame not to build up the much-needed database of 140 mm models, at its expense. Noctua’s new generation of 140mm LCP fans could easily come out as soon as January and we need to be ready for it, and we are a long way from a relevant reference database. We finally found an understanding with Thermaltake and we already have the Toughfan 14 Pro in the editorial office, which I would say from the first glance might be the most efficient 140 mm fan with the best airflow-to-noise ratio even in practice (on obstacles) at the moment.

      Apex Stealth is on our radar, but there are more important things we want to prioritize for the above reasons.

      1. Totally understandable. The 14 cm fan database is definitely a strong priority now. I highly appreciate how you have a roadmap on your own that is barely affected by what others are doing, and it makes sense from a knowledge perspective (for instance, I am amazed how well the Light Wings’ performance and noise characteristics can be predicted based on previous knowledge of the FK120).

        A point I had in my mind is that a slight/occasional focus on new hardware and currently hot topics might bring in more viewership (which I hope this site can gain lots of), but I can understand that it’s not something you’d like to do as you’ll be losing focus on what really matters in the end.

        1. Of course, we are very interested in the tests of new fans at the launch. But we need to know about them first. In the case of Alphacool, no one has contacted us to tell us that something is in the pipeline. But I don’t mean to imply that the problem is someone from them, on the contrary. Communication with Alphacool is very good, but we also have a few things to do for them on which we are behind (tests of older Core fans, which we still haven’t released), so maybe that’s why they haven’t offered us anything else. However, we are preparing the ground for other companies to be able to come out with tests of new fans at the same time as their launch. For example, as it will be in the case of Scythe Grand Tornado and other models of other brands, which I don’t know if I can write about now (but I probably shouldn’t :)).

          But the first priority is always to have enough time to fully test stuff. Rushing just to be the first to have something has never appealed to me (with robust testing methodologies that are time-consuming), though clearly, I’m certainly depriving HWCooling of traffic. But who knows what kind of readership drain we’d have if we had a different approach to working with “quick” (incomplete and obscure) tests, which ultimately I wouldn’t even enjoy doing. 🙂

          1. >we are preparing the ground for other companies to be able to come out with tests of new fans at the same time as their launch… as it will be in the case of Scythe Grand Tornado and other models of other brands

            That’s very good news and I look forward to them. Day 1 reviews with zero compromise in integrity are the best of both worlds. The latter, I do agree, is the first priority, and it’s part of what makes HWcooling special.

  2. After watching the latest HWBusters/Cybenetics + GamersNexus, I wrote this piece on a SFF discord channel, and just wanted to post it on reddit as well, sry for the wall of text 😅

    With Mike Chin from SPCR + Steve from GamersNexus and Aris from HWbusters/Cybenetics peer reviewing each other’s frequency analysis, results from anyone else who doesn’t join them are effectively not accurate enough

    I mean Hwcooling is infinitely more accurate compared to Igor’s Lab, and a very good source of subjective conclusions and experience, but a test without peer review is not objectively accurate

    Ideally Hwcooling would fully cooperate with GamersNexus, HWbusters and Mike Chin to bring their fan-specific frequency expertise, and then all 4 would settle on a methodology and explanations for frequency testing

    And the beauty of it, it doesn’t require them to have expensive machines, or expensive chambers, if the cooperation is only for frequency methodology and Hz impact explanations (+ if HWcooling builds/buys their own hemi-anechoic chamber, then they’d be fully equal)

    Additionally, there could be two Tiers, or two Peer Classes

    Hemi-Anechoic Chamber Tier/Class at 6dB SET/~10-14dB GET noise-floor @ 1m

    Semi-Anechoic Chamber Tier/Class at ~15-20dB SET

    And then whoever is the Lead authority on the 2nd Tier/Class can form a “club” and set standards for Expreview, HardwareCanucks, Major Hardware, LinusTechTips, ThermalLeft, Igor’s Lab, etc, …

    SET = ideal/intended/theoretical noise-floor for a chamber

    GET = actual noise-floor for Hemi chamber due to outside noise/vibration differences

    1. Thanks for the comment.

      I admit I don’t fully understand the part “a very good source of subjective conclusions and experience, but a test without peer review is not objectively accurate”. Specifically the use of the phrases “subjective conclusions” and “is not objectively accurate”.

      I don’t know what all Gamers Nexus has been saying lately (I try not to pay attention to it, I feel sick from their marketing…), but I do know that first of all they should learn the basics to be able to read and interpret the frequency analysis correctly and meaningfully. There is a huge difference between how they present some things and what they end up testing (or being able to extract from the data). Testing the Fractal Design Terra case in their anechoic chamber, which they don’t need at all for the things they do, is a mockery. And measuring frequency analysis from a meter according to a standard is also impractical and suitable for slightly different purposes. From such a distance (1 m), the small details that distinguish great fans from very good ones disappear.

      I really respect and appreciate the hard honest work that is done on fan tests by ThermalLeft, for example, who you mention. They, I don’t know if you noticed, do cross-verification of our conclusions from time to time. I don’t consider pompous youtubers like Gamers Nexus to be honest, and I definitely don’t consider them to be people with expertise in noise analysis. I’d rather not elaborate further, so that it doesn’t go in the wrong direction here. My attitude to their approach to things is expressed in this blog from the beginning of the year.

      PS: “Ideally Hwcooling would fully cooperate with GamersNexus” I can guarantee you that this won’t happen. One of the driving forces behind my ability to test 18–20 hours a day is that there should be an expert alternative to the tabloids, of which GN is king.

  3. For Subjective vs Objective, I was just trying to make a distinction like in scientific fields: that without peer review, results are not 100% objective, not 100% accurate.

    Until someone with a same/similar lab, chambers and same/similar measuring devices that are calibrated reviews the same product you did, there’s room for intentional or non-intentional mistakes/issues.

    Once someone else or ideally more people/labs/reviewers get the same results (within margin of error), then the results can be considered 100% objective and 100% accurate (within reason again xD)

    GamersNexus is consulting Mike Chin of SilentPCreview.com, who is a veteran of many decades of testing in relation to noise frequency spectrum graphs and general noise. He + another 3rd party chamber contractor have been teaching GamersNexus how to read and interpret frequency analysis over the past few months.

    GamersNexus also had a reputable company build them a $250 Hemi-Anechoic Chamber.

    Aris from HWBusters and Cybenetics also built a hemi-anechoic chamber with a ~6dBA noise floor.

    The 3 of them cross-tested 2 cases and 2 handheld devices, to peer review each other’s data (I’m not sure if they used the same measuring devices)

    And for your last sentence, If all of you have a similar enough Hemi-Anechoic Chamber and use Similar/Same devices, then your frequency graphs can be compared.

    That’s what I mean for co-operating, having public visible frequency graphs that can be compared, that either match or don’t (within margin of error)

    1. I’ll try it another way and I’ll start by saying that I respect Mike Chin and I remember him from the old days when the mainstream HW media was not the tabloid it is today. I still admire the HDD noise measurements made by SPCR to this day. They really knew what they were doing. Among my “favourite photos”, of which I don’t archive many, I even have this photo from their anechoic chamber. The only thing I have a strong aversion to is GN, because of their false notion of investigative journalism. In my opinion, they have done a lot of harm quite unjustifiably by applying cherry-picking techniques combined with sensation-seeking to portray some things in a way that they are not in practice. But that is for another time. I would like to explain why I do not consider it appropriate to follow the same procedures for measuring the frequency analysis of fans that are “standardly” used for measuring noise.

      What you are describing sounds nice, and of course it protects the consumer (the reader of the review) from the test author being able to manipulate the data as they wish. Assuming, of course, that all parties involved are 100% objective. But something like a spectrogram is really hard to fake, and I don’t doubt the authenticity of the results across laboratories at all. But at the same time, I know from my own experience that when measuring frequency analysis the way GN (Aris and Mike?) do, it does happen that the resolution is not enough for some elements of the sound, due to measuring from too great a distance. Below I will write one example from recent experience.

      We spent a really long time testing the Seasonic Magflow 1225 PWM fan because we suspected that the samples we got were possibly affected by a manufacturing defect that shouldn’t have been present in all of the units. This was manifested by high, narrow tonal peaks at frequencies around 1 kHz. I was convinced that with such expensive fans this must be the result of a rare manufacturing defect, which was not confirmed in the end. These particular samples, which we thought might be faulty, were sent back to Seasonic directly for analysis, and the labs they work with for noise measurements found no anomalies in them. I suppose because of the measurements from too great a distance where these sound components are already lost, but they were also registered by ThermalLeft and for a demanding user it can be a dealbreaker, for which they will choose a different fan. This is to say that measuring from one meter is fine, but not for differentiating between a great and a very good fan. The accuracy may be perfect, but it is low resolution and does not reveal such things (and believe me, after installing six such fans in a case, after the amplification of that critical sound, people will not be pleased with the great fans they have chosen), and thus I do not consider the given measurement method optimal.

      I also respect Aris (his knowledge and experience in PSU testing is admirable) and I know his view on measuring things by standards vs. by your own, customized methodologies. I agree that as long as standardized procedures are followed, simple verification of results by a third party lab is possible. This is all perfectly fine. In practice, however, it may not always be the better option if the resolution of the measurement procedure for a particular case is not high enough.

      We do not plan to build new, standardized anechoic chambers. Simply because, based on my experience, I know that this would not advance knowledge in any way and would be counterproductive. I hope I have explained the reasons “why” clearly. So what if someone focuses on one of the many aspects, takes it beyond what is sufficient, and misses the rest, has no comparison, and does not look at things comprehensively, i.e. from all angles (but only in a very limited way, which they subjectively consider to be important)? The key should perhaps be to match the conclusions of various analyses, which the reader can grasp and understand. Having super-precise absolute values of some observed variable is fine, but in my opinion it is not enough to have the properties of a given product under control and to be able to choose which is better or worse in comparison with another and approximately in what ratio. I am convinced that really useful tests should be based on different foundations. But it is possible that I am wrong…

      1. Yeah, the spectrogram part is why I mentioned that the first thing I wrote was tailored for a specific discord channel, and I copy-pasted it without change, so it’s not really the best fit for HWcooling, for you guys, cause you already have everything spectrogram-related out in the open xD. It was more for others, so that they have their chamber, measuring tools and spectrograms publicly available in their respective reviews.

        Take the Mobius OC for instance: the Machines & More youtuber also noticed problems with his ears, which matches your findings (he has really good ears, and has often mentioned noise-related issues other channels don’t find, and they roughly match your noise analysis as well).

        Fast forward a few months, and there’s a few people on a few Small Form Factor ITX discords that go through tons of fans, and they’ve gotten a few Mobius OC fans that are actually pleasant, so maybe Cooler Master fixed some issues (or not all samples have it). I did point the Cooler Master rep towards your testing, so maybe he advised the company to find another 3rd party lab, so the situation with Seasonic, and Magflow doesn’t happen.

        And yeah, I fully agree, there’s gonna be noise-related stuff that doesn’t get noticed by the most used ISO @ 1 m standard, which is a shame. And it also impacts companies.

        jonnyGURU (the famous PSU reviewer and mentor of Aris), who is now the Chief Research & Development Officer for Corsair, often posts in discord channels, and mentions how Corsair has had huge RMA issues in Germany for noise.

        He mentions that in Germany (and possibly Japan), 90%+ of RMA return reasons are for noise issues: harmonic resonance, fan PWM circuitry clicking, burst-mode clicking, vibration, inadequate fan curves, shot bearings, etc., …

        In every other country in the world, noise reasons are 1–2%, always <5% of RMAs, but in Germany Corsair is losing big money whenever their PSUs and other components have noise-related issues.

        He also showed how Corsair is testing for this, and it's not at 1 m xD, they've had music/audio industry grade measuring tools very close to the PSU, in a custom testing setup to find this. For the same reason you mention.

        He's tried to point this out to Aris, that testing only @ 1 m is not enough, and would not be enough for Corsair to use Cybenetics as a 3rd party lab exclusively. He's still happy about a new certification authority, but hard data, actual money loss in Germany, can't be denied.

        1. Also, all this fan talk happening everywhere, might have spurred jonnyGURU to take more interest in fan design xD. He mentioned he’s happily participating in a new fan design from Corsair.

          You might find this interesting: he also finally managed to convince Corsair to stop forcing the small triangle mesh on the exhaust side of PSUs, because it caused huge problems for them.

          The triangle mesh was so inadequate compared to the standard hexagonal mesh on the exhaust that those newer visually updated Corsair PSUs (especially the SFX/SFX-L prototypes) were much louder and ran much hotter, just because the new mesh screwed up the airflow xD

          Also another thing I found funny is that Corsair accidentally proved their fans to be inferior xD. Corsair forced jonnyGURU to use Corsair’s own maglev ML120/ML140 fans instead of the standard Hong Hua ones they used, and Corsair’s own ML fans had worse performance, inadequate static pressure, and were running at higher RPM and louder than the Hong Hua fans xD

          A direct implication for a company of the mesh/filter stuff you’ve covered as well

        2. P.S. Above where I mention losses for Corsair, that’s not correct. jonnyGURU never mentioned that, nor is it my speculation, I just got carried away and didn’t notice I used too extreme words 🙁

          Paraphrasing jonnyGURU, he said something more in line with: huge headache, huge problems, because having the same complaint on so many RMA emails is a red flag; the division that tracks RMA reasons then gets back to the engineers and asks them to fix that, to reduce the complaints.

          What jonnyGURU did want to point out is that if any company feels they don’t need to focus on noise stuff cause it doesn’t lose them any money, Germany will lose them “some money”

          and anyone who’s big or wants to be big in Germany, will have to go the extra mile to fix noise/fan issues

          1. Corsair is such a giant and moves so many units that RMA complaints add up to a big sum of money. Other small companies not moving many units might not notice, and they have shorter warranties, so they might not feel it viable to spend money and employee hours fixing noise issues.

            + Corsair is very fast to discontinue product lines that don’t move in very big numbers, which is not the case for other companies.

  4. Just the bare minimum “co-operation”, no fan performance comparison, case, cooler, etc, … just

    1. Chambers
    2. Calibrated same/similar measuring devices
    3. Frequency peak graph data

    Like GamersNexus + HWBusters/Cybenetics + Mike Chin are doing right now

  5. I missed a few things/responses xD:

    Mike Chin, the consultant, has pioneered silence reviews and silence tweaks in the PC space since early 2002 (as can be seen on his website), and his work has contributed a lot to reviewers and Factories and Companies.

    I’ve used ThermalLeft videos to listen to fans, together with theFlow2000 and Eiglow_

    And I’ve checked a few of his reviews on quasarzone, and noticed some cross-reviewing/comparing of reviews between HWcooling and him, yes

    1. Labs work with different equipment, materials and methodologies all the time, sometimes due to funding constraints, sometimes because an alternative method has higher throughput. Different methodologies often have their strengths and weaknesses so there’s not necessarily a single method that is 100% better than the others.

      Results don’t suddenly become invalid if you don’t use the “best” method, or a “gold standard” test. Of course the gold standard test should be compared to when you are developing new methods, but that doesn’t mean other tests are invalid. A conclusion is believable if it can be reproduced by yourself and others using the same method, even more so if different methodology can arrive at the same conclusion. Methods and the results they produce are valid as long as they have rationale and data to support them.

      That said, I am not sure if a “gold standard” even exists in terms of frequency analysis of sound on PC components… For example, the industry standard is measuring noise at 1 m, but whether that distance is actually useful in the real world is questionable. I also don’t think a room-sized anechoic chamber is needed for proper noise analysis on something as small as PC fans, honestly. Now the question is if the Longwin machine can be considered the gold standard for airflow measurement.

      1. I will comment only on the last sentence, I would happily sign my name under the rest – we see it the same way.

        Although I have never come across the Longwin LW-9266 in practice (but I believe it will come one day :)), I respect the design of this device. I have studied the technical documentation and there are some things that I have pondered, but it would be really irresponsible to comment on them and somehow evaluate their impact on the relevance of the measurements. When I look at Aris’ results and compare them to ours, they scale well with the NF-S12B FLX, with which he also measured a significantly lower airflow than Noctua states. And that was even at a higher speed (1319 rpm) than the official parameters (i.e. 1200 rpm).

        Noctua, of course, will defend themselves tooth and nail that those specs are accurate because they use this or that measuring machine with perfect calibration, but… the question is whether that has not only been the case recently. I personally have trouble trusting the specified airflow figures of Noctua fans older than the NF-A12x25, and it’s hard to justify them compared to current fans even if you try to look for these results in the design elements. Why should the design of the NF-S12B at 1200 rpm have a comparable airflow to the NF-A12x25 at 2000 rpm, and at a significantly lower noise level at that?

        1. On the S12B, the blade designs are so different from modern fans I am willing to give them the benefit of doubt. Until your review, of course :). There’s some deviation between your results and Aris’ so it would be nice to see more data first.

          The Longwin LW-9266 definitely functions very, very differently from the HWcooling wind tunnel. Instead of measuring airflow directly using an anemometer, it, from what I understand, measures the static pressure of two chambers and uses these values to calculate airflow. Instead of aiming for as little resistance to airflow as possible, it employs screens that minimize the kinetic energy of the air stream, and compensates for the resistance via an exhaust fan (which can also be used to create different pressures). Two extremely different designs, yet I don’t see a reason why the result from either should be invalid.
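
          A simplified sketch of that double-chamber principle (with assumed, purely illustrative values; this is a general AMCA 210-style approximation, not the LW-9266’s exact implementation): the airflow is derived from the pressure drop across a nozzle of known area rather than read from an anemometer.

          ```python
          # Double-chamber principle, simplified: the fan pressurizes chamber A, air
          # passes through a nozzle of known area into chamber B, and the volumetric
          # flow is calculated from the measured pressure drop across that nozzle.
          # All values below are illustrative assumptions, not LW-9266 data.
          import math

          AIR_DENSITY = 1.2  # kg/m^3, assumed room conditions

          def airflow_m3h(delta_p_pa: float, nozzle_area_m2: float,
                          discharge_coeff: float = 0.95) -> float:
              """Volumetric flow from the nozzle pressure drop (incompressible approximation)."""
              velocity = math.sqrt(2 * delta_p_pa / AIR_DENSITY)          # m/s in the nozzle
              return discharge_coeff * nozzle_area_m2 * velocity * 3600   # m^3/h

          # Hypothetical reading: 30 Pa drop across a 50 mm diameter nozzle
          nozzle_area = math.pi * (0.050 / 2) ** 2
          print(f"{airflow_m3h(30.0, nozzle_area):.1f} m^3/h")
          ```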

          1. Clearly, Longwin applies completely different principles than our wind tunnel. But I meant that if a fan performs better or worse on the LW-9266, our conclusions will probably never contradict it? I am familiar with the technical implementation and the way of measuring with the LW, but I admit that I have not studied the differences in the results (between it and our device) in detail. I was mainly interested in the results of the NF-S12B and those come out significantly below the official specifications in the Longwin, even at higher speeds.

            I have argued several times about whether our results or the results from the LW-9266 are more accurate from a scientific point of view, but no one has ever come up with a relevant argument that would prove that the Venturi effect is pulling the short end of the stick. The claim that our tunnel (compared to the LW) does not really have zero static pressure, and that we thus have a larger deviation from reality in the airflow measurements, I do not consider very responsible, even as an attack on the short length of the tunnel. If such conclusions do not come from inattention (and misunderstanding of the principle of our tunnel), then they certainly come from inexperience with this type of measuring equipment. And… we also have some “foreign know-how”. No, I’m joking now, of course. I consider such statements a sign that someone is not sure of their work and cannot rationally defend it. Especially when there is no such competition in this segment that someone would immediately copy Igor’s great ideas and use them for his own “enrichment”.

            There is little in the world that is as inefficient and unprofitable as honest testing of fans. Apparently, this is why many of the media have to use a little demagoguery (including the non-transparent measurement procedures of Hardware Canucks, if there are any procedures at all and what they present is not pulled out of thin air) to sell those fans, where every other tested one comes out “the best”. I think you know what I’m talking about. It makes me extremely angry, because it distorts public opinion in an area where we strive for maximum transparency and to make things as clear as possible about fans. But what can be done, oh well. Most of the media are happy to sell their honour and I don’t even see any real interest in fans there. In my view, real interest looks a little different: the way ThermalLeft, for example, does it.

            Anyway, I will come back to the LW-9266. It is natural that our results cannot scale with its results 1:1 only because Longwin does not in any way laminarize the airflow in front of the fan. This means that there will often be a slightly different (lower) dynamic pressure at the exhaust. And of course there are many differences, it is a different way of measuring. But I would say that in both cases it is relevant enough to make practical conclusions. When LW is used in combination with obstacles (mounting of which should be possible?), I have no doubt that Gamers Nexus will one day reach some useful results. That is, if they didn’t change their minds already or if they won’t abuse people’s trust again and deliberately present distorted results according to what suits their own marketing, often based on negative critique.

            PS: When comparing our airflow results with the results of HWBusters, it is important to always compare at the same speed to avoid excessive distortion. It’s not enough to look at the maximum airflow and assume that the fans are spinning at the same speed. On the contrary, each fan has a slightly different maximum speed, and that is why in these measurements we use the average of at least two samples. How Aris does it, I do not know. Even in this respect, as far as manufacturers’ parameters are concerned, I do not think there is a uniform standard for the conditions at which they state the airflow. Some base it on the maximum indicated speed, others on a nominal voltage of 12.0 V. In the case of BeQuiet! fans, it seems to be as high as 13.2 V, which is unattainable in a PC. I am still waiting for an answer to this question and somehow they do not want to comment on it. Note that we always measure lower speeds for their fans than the ones listed, just like this Silent Wings Pro 4 user.
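
            A minimal sketch of that point, with illustrative numbers only: in the usable operating range, airflow scales roughly linearly with speed, so a figure measured at a sample’s actual maximum speed can be rescaled to the advertised speed before comparing.

            ```python
            # Fan affinity approximation: airflow ~ rpm. Values are illustrative,
            # not measured data.
            def airflow_at_speed(measured_airflow_m3h: float, measured_rpm: float,
                                 target_rpm: float) -> float:
                """Rescale an airflow figure from the measured speed to a target speed."""
                return measured_airflow_m3h * target_rpm / measured_rpm

            # Hypothetical sample that overshoots its rated 1200 rpm
            print(f"{airflow_at_speed(100.0, measured_rpm=1319, target_rpm=1200):.1f} m^3/h at 1200 rpm")
            ```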

            1. >But I meant that if a fan performs better or worse on the LW-9266, our conclusions will probably never contradict it?
              Yes, that’s my point and it is kind of a response to Dogzilla. Different methodologies can arrive at the same conclusions, if they are designed and executed correctly. I fail to see any faults in your tunnel, and even if there are any, there is no evidence that the results are affected.

              >laminarize the airflow in front of the fan
              I thought only the airflow arriving at the anemometer needs to be laminarized, so that’s new to me. Some elaborations on this topic will be welcome. Or maybe it’s not a “need” but rather a natural result from putting the anemometer on the intake side of the fan?

              Thanks for the reminder on the possible differences in RPM, I do check them in noise-normalized tests to check if the relative levels between fans are different (the data between you and HWBusters agree), but I do often forget to check the RPM at maximum PWM%.

              1. “I thought only the airflow arriving at the anemometer needs to be laminarized”

                Yes, and of course this is also true for the LW-9266. What I was referring to is the level in front of the fan, where the airflow is more turbulent than it would be with a straightener. Think of it as a tunnel with a circular cross-section, which is at least as large at the intake as the cross-section of the rotor of the fan under test. If you place a similar member in front of the fan, the pressure will increase due to the suppression of intake turbulence and you will naturally measure a higher airflow. For this reason, in order to keep turbulence as low as possible (and the exhaust airflow as high as possible), fan manufacturers put various channels on the trailing edges (for example, on the NF-A12x25, Noctua calls them Flow Acceleration Channels, on the P12 this function is fulfilled by Vortex-Control Notches). These only help to suppress microturbulence, which always occurs, regardless of the environment in front of the fan. The straightener I am writing about (and Longwin doesn’t have it) is more of a solution for macro turbulence. The air is drawn too turbulently. But I don’t want to invalidate the device, I just want to explain the differences in measurement on one of the many levels, one that is obvious even without the LW needing to be tested in any way.

  6. @M

    I don’t disagree with you, I’m very sensitive to noise, I can hear harmonic resonance and other noise even in videos from ThermalLeft and others easily, that most don’t.

    I’ve successfully pointed out double-ball bearing from other motors in a double blind test, and I currently use Noctua NF-A12x25 at <850RPM, cause they become audible at ~850RPM.

    Testing at 1 meter has significant disadvantages for us who have our computers at 35–50 cm distance from our ears, in smaller form factor cases with a mesh/filter side, + noise sensitivity. And I remember someone here, maybe even you, mentioning a more expansive ISO standard than 17025/17065, specifically the ITU-R 468 noise weighting standard that is partially implemented in ISO 21727. But Aris, like Hwcooling, expressed that it's too much and not viable to do xD

    But on the other hand it's gotta start somewhere; getting at least something, a small piece of testing, standardized or checked with a certification authority would be awesome.

    @Ľubomír Samák12

    And with how respected Aris/Cybenetics is in the industry and by other reviewers, and considering he's trying to build up a certification authority more than compete as a reviewer, other reviewers like you could use his lab for validation. And there's no need for it to ever be public.

    There could be a delay on publishing results, or the results could be published partially. Cybenetics is already doing this for power supplies; certain companies ask for their results not to be published publicly, or to be published in a redacted manner.

    As for Longwin LW-9266, yeah, your testing method has an advantage of directly having various obstructions like filter/rad/mesh present, and those obstructions have to be present in the sound chamber as well, cause they all change noise behavior of fans.

    And currently neither GamersNexus nor Aris use any obstructions in their Longwin and hemi-anechoic chamber results. And yeah, I agree both results are not invalid, and I don't think it's viable between competitors to compare (+ everyone would kinda need a climate chamber, and ~15 randomised samples for variance for full 100% accuracy, and that gets kinda too much).

    That's why I singled out semi-anechoic/hemi-anechoic chambers and frequency peak graphs. That's a smaller part of everyone's testing, and everyone has the capability to get same/similar results, so even if there's not an ISO standard,

    just stating what chamber everyone has, how much dBA ambient it can do, what measuring tools everyone has, and having the noise spectrum frequency analysis base data graph publicly visible is already tremendous for us readers/viewers/users.

    If there was more cooperation behind the scenes, sure, it would be awesome, but just this part (which a ton of you are already doing) goes a long way to help us readers/viewers be confident in results

    And in the future, if Cybenetics becomes a viable and widely respected certification authority for many computer parts, everyone could use them without fear of losing clicks, or fear of the competition.

    1. „I’ve successfully pointed out double-ball bearing from other motors in a double blind test, and I currently use Noctua NF-A12x25 at <850RPM, cause they become audible at ~850RPM.“ But that’s mainly because ThermalLeft measures sound from a much shorter distance than one meter. The smaller the separation between non-aerodynamic sounds, the deeper such sounds fall below the resolution when measured from 1 m.

      Aris does excellent lab tests with excellent equipment, but I don’t find the conclusions of his fan tests to be very practical. In short, I don’t know these fans like the back of my hand from these measurements, and I don’t know which fan is “better” or “worse” compared to another. Power supply tests, that’s something else… in that respect I don’t know of anything better and more practical than what HWBusters/Cybenetics does.

      „… he’s trying to build up a certification authority more than compete as a reviewer“ Yes, I see it similarly. They do these things really well, but evaluating fans so that the user gets the most out of it in a practical way is a bit of a different discipline. At least when it comes to fans.

  7. “impossible”?
    That seems a pretty strong statement…

    But I’m no expert.
    The idea of putting a round plastic fan inside a heavy metal square frame separated by a small gasket is interesting. Supposedly the heavy metal frame will resist the vibrations better…
    Though maybe the bigger idea is attaching an outer ring directly to the fan blades. So instead of moving fan blades forcing air through a static frame, they use a moving outer ring. So the air only flows through moving blades and ring. Supposedly creating less turbulence. Less turbulence means more airflow, more pressure, and less noise.

    I suppose it could work…
    But again, I’m no expert.

    One thing I noticed that seems out of whack…
    The graphs they have show static pressure as being fairly linear with RPM. And not just with their fan, but also the fans they compare to!
    I am pretty sure this is not right. In basically every decent review I have seen, static pressure is non-linear, generally increasing faster than the RPM of the fan. (Above a certain point it is often similar to a noise vs RPM graph.)
    My observations are that air flow vs RPM is often somewhat linear, but noise vs RPM and pressure vs RPM are NOT.

    Based on this it seems to me that the static pressure charts that they show are simply BOGUS!

    Also, that the Noctua NF A 12 is the worst fan in their SPL vs RPM chart…?
    6 different fans total, and the Noctua NF A 12 is the worst?
    Doesn’t match with what I have seen from other tests.

    So clearly both the Alphacool website and igorslab have some bogus graphs.

    1. — “impossible”?
      “That seems a pretty strong statement…“

      Why? Fan parameters are based on their design features and I don’t see anything within them that would put Apex Stealth in the position of the most efficient PC fan in the world, as the specifications claim. And even with such a lead in a highly restrictive environment… Believe me, there is nothing I would rather read right now than a satisfactory technical explanation from someone who developed those fans, or went through the specs, with the reasons why this should happen. Sure, there’s nothing easier than writing to Alphacool (we have a contact), but I don’t want them to take this as an “attack on competence”. Especially when the Apex Stealth will undoubtedly be a great fan, but not the “best” one with a significant kick. IF, in hindsight, I turn out to be wrong, you can, of course, consider me a dilettante who doesn’t think about words leaving his mouth. Sooner or later the Apex Stealth fan will also appear in our tests.

      PS: For the older 120 mm ES-120 fan, Alphacool lists a static pressure of 16.06 mm H2O. We have it in our editorial office, we checked it, and although the static pressure is very high, it does not meet the specifications. It looks like there is an overestimation, but again, these are “just” figures on paper, which do not affect the technical qualities of these fans. And it makes me a bit sad when these things (around the parameters) have to be pointed out in a negative way, because whether we want to or not, it is part of the evaluation.

  8. Now it looks like Alphacool has taken down the very misleading info they had on their website for the fan, and has replaced it with lower figures from more recent testing.

      1. Only the parameters should be a bit lower, if I understood correctly and Alphacool took them from Aris’ tests. According to my information, the official parameters should not take into account the +/- 10% tolerance and should refer to the advertised speed (3000 rpm, not 3091 rpm). Each piece has a slightly different maximum speed. Evaluating this on the basis of one sample, I don’t know… the noise level also varies from piece to piece. But that’s just a “nitpicking” detail, and it’s great that there have been official corrections to fix the glaring deficiencies.
