Doctors who deal with hand problems may be plastic surgeons, orthopedic surgeons, or general surgeons. One source says:
"Hand surgeons are orthopaedic, plastic, or general surgeons who have additional training in surgery of the hand. To become members of the American Society for Surgery of the Hand, hand surgeons must take a full year of additional training and must pass a rigorous certifying examination."
There doesn't seem to be a single dedicated term for a hand specialist, the way "podiatrist" denotes a foot doctor.