What factor determines electronegativity?
1 Answer
Given the definition of electronegativity, the determining factor is likely nuclear charge (i.e. Z), offset by how well the inner electron shells shield the valence electrons from that charge.
Explanation:
Electronegativity is conceived to be the ability of an atom in a molecule to polarize electron density towards itself. Note that I write "conceived", because it is not a fundamental atomic or molecular property in the way that ionization energy or dipole moment is. Given this definition, it is easy to see why the elements towards the right-hand side of the periodic table are more electronegative than those on the left. Across a row, each added electron enters the same valence shell and shields its neighbours poorly, so first-row atoms such as fluorine and oxygen (on the RHS of the table) feel a high, poorly screened nuclear charge; it makes sense that these atoms are considered highly electronegative on the various scales.
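For concreteness, here is a minimal Python sketch of one such scale, Pauling's thermochemical definition: the electronegativity difference between A and B is the square root, in eV, of the "excess" A-B bond energy over the mean of the A-A and B-B bond energies. The bond energies used below are approximate literature values, not figures from this answer.

```python
# Pauling's thermochemical electronegativity difference.
# Bond dissociation energies are given in kJ/mol and converted to eV.

KJ_PER_EV = 96.485  # 1 eV expressed in kJ/mol

def pauling_difference(d_ab, d_aa, d_bb):
    """|chi_A - chi_B| from bond dissociation energies in kJ/mol."""
    excess_ev = (d_ab - 0.5 * (d_aa + d_bb)) / KJ_PER_EV
    return excess_ev ** 0.5

# Approximate bond energies for H-F, H-H, and F-F (kJ/mol):
print(round(pauling_difference(565, 436, 155), 2))  # ~1.67
# Pauling's tables give chi_F - chi_H = 3.98 - 2.20 = 1.78
```

The point of the exercise is the one made above: the scale is built from measured bond energies, not from any single fundamental atomic property.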
So the obvious follow-up question is: why are, say, sulfur and chlorine less electronegative than their first-row congeners, oxygen and fluorine respectively? Certainly the second-row atoms have the greater nuclear charge. The answer is that when you descend the table by a row, a full, complete shell of electrons effectively shields the valence (outermost) electrons from the increased nuclear charge. After all these years I think I can still remember the Pauling electronegativities of the halogens: F, 3.98; Cl, 3.16; Br, 2.96; I, 2.66.
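A rough way to put numbers on that shielding argument (not part of the original answer) is Slater's rules, which assign each inner electron a screening increment. The Python sketch below uses a simplified form of the rules, grouping the s and p electrons of a shell together and folding d electrons into their principal shell:

```python
# Simplified Slater's rules: effective nuclear charge felt by an
# outermost s/p electron. Same-group electrons screen 0.35 each,
# the (n-1) shell screens 0.85 each, deeper shells screen fully.

def slater_zeff(Z, shells):
    """shells: electrons per principal shell, innermost first,
    e.g. chlorine (Z=17) -> [2, 8, 7]."""
    n = len(shells)                          # valence principal quantum number
    same = 0.35 * (shells[-1] - 1)           # other valence electrons
    inner = 0.85 * (shells[-2] if n > 1 else 0)   # (n-1) shell
    deep = 1.00 * sum(shells[:-2])           # (n-2) and below
    return Z - (same + inner + deep)

halogens = {"F": (9, [2, 7]), "Cl": (17, [2, 8, 7]),
            "Br": (35, [2, 8, 18, 7]), "I": (53, [2, 8, 18, 18, 7])}

for sym, (Z, shells) in halogens.items():
    zeff = slater_zeff(Z, shells)
    n = len(shells)
    # Z_eff/n^2 is a crude one-electron proxy for how tightly the
    # valence shell is held.
    print(f"{sym}: Z_eff = {zeff:.2f}, Z_eff/n^2 = {zeff / n**2:.2f}")
```

Z_eff itself actually creeps upward down the group (about 5.2 for F versus 6.1 for Cl), but the valence electrons sit in successively higher shells, so the crude tightness measure Z_eff/n^2 falls steadily from fluorine to iodine, tracking the electronegativity trend above.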