The corrosion resistance of high-temperature alloy steel castings is closely related to their chemical composition. Whether a stable, dense, and strongly adherent oxide film can form on the surface in a high-temperature, chemically complex environment is the key factor determining corrosion resistance. The main alloying elements affect corrosion resistance as follows:
Chromium (Cr) is one of the most critical elements for corrosion resistance. At high temperatures it reacts with oxygen to form a dense protective film of chromium oxide (Cr₂O₃), which effectively prevents oxygen, sulfur, and other corrosive gases from penetrating further into the metal matrix. As the chromium content increases (typically between 18% and 30%), the material's resistance to oxidation and sulfidation corrosion improves markedly, which is why high-chromium alloys are widely used in sulfur-bearing combustion atmospheres and high-temperature oxidizing environments.
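The protective chromia scale described above forms by direct oxidation of chromium at the metal surface; as a simple illustration, the overall reaction can be written as:

```latex
4\,\mathrm{Cr} + 3\,\mathrm{O_2} \longrightarrow 2\,\mathrm{Cr_2O_3}
```

Once a continuous Cr₂O₃ layer is established, further attack is limited by slow solid-state diffusion through the scale rather than by direct gas-metal contact.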
Although nickel (Ni) is not itself a strong oxide-forming element, it stabilizes the austenitic structure and improves the alloy's toughness and thermal-fatigue resistance at high temperatures. Nickel also improves corrosion resistance in reducing media, such as certain acidic environments, and helps to improve the adhesion and self-healing capability of the oxide film.
Molybdenum (Mo) provides good resistance to chloride-ion attack, particularly against pitting and crevice corrosion. It also enhances the material's stability in reducing acids (such as hydrochloric and dilute sulfuric acid), so it is frequently specified for highly corrosive service such as chemical-process equipment.
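The contribution of molybdenum (together with chromium and nitrogen) to chloride pitting resistance is commonly summarized by the empirical Pitting Resistance Equivalent Number, PREN = %Cr + 3.3 × %Mo + 16 × %N. The sketch below computes it for two illustrative compositions; the weight-percent values are hypothetical examples, not data from the text:

```python
def pren(cr: float, mo: float, n: float = 0.0) -> float:
    """Pitting Resistance Equivalent Number (common empirical form).

    cr, mo, n: alloy content in weight percent.
    """
    return cr + 3.3 * mo + 16.0 * n

# Hypothetical compositions for illustration (wt%):
print(pren(18.0, 0.0))         # plain 18% Cr grade -> 18.0
print(pren(18.0, 2.5, 0.10))   # Mo- and N-bearing grade -> 27.85
```

A higher PREN indicates better expected resistance to pitting in chloride media, which is why Mo additions are favored for such environments.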
Silicon (Si) and aluminum (Al) can likewise form protective oxide films (SiO₂ and Al₂O₃). Under certain high-temperature oxidation conditions these oxides are even more stable than Cr₂O₃, which further improves oxidation resistance. However, their additions are usually kept low, since excessive amounts can impair the material's ductility and casting properties.
The effect of carbon (C) on corrosion resistance is more complicated. An appropriate carbon content improves the material's strength and wear resistance, but too much carbon promotes carbide precipitation at grain boundaries, which can cause intergranular corrosion, especially during welding or high-temperature service. For applications that demand good corrosion resistance, low-carbon or ultra-low-carbon alloy designs are therefore often used.
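The intergranular attack mentioned above stems from sensitization: excess carbon combines with chromium at the grain boundaries to form chromium-rich carbides, locally depleting the chromium available for the protective film. As a simplified illustration, the precipitation of the typical M₂₃C₆-type carbide can be written as:

```latex
23\,\mathrm{Cr} + 6\,\mathrm{C} \longrightarrow \mathrm{Cr_{23}C_6}
```

The chromium-depleted zone along the boundary then corrodes preferentially, which is why low-carbon grades or carbide-stabilizing additions are used.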
In addition, microalloying elements such as titanium (Ti) and niobium (Nb) reduce the formation of harmful phases by fixing nitrogen and stabilizing carbon, indirectly improving the material's corrosion resistance, especially its resistance to intergranular corrosion.
The corrosion resistance of high-temperature alloy steel castings is thus determined by the synergistic action of multiple alloying elements. By adjusting the chemical composition appropriately, excellent protection can be achieved in different corrosive environments: raising the chromium content for oxidizing atmospheres, adding molybdenum for chloride-containing media, and introducing aluminum or silicon where oxidation resistance is required at extreme temperatures are all common optimization strategies.