The U.S. has become significantly less Christian in recent years as the share of American adults who claim no religious affiliation has risen sharply, a major new study found.

For what is probably the first time in U.S. history, the number of American Christians has declined. Christianity, however, remains by far the nation’s dominant religious tradition, according to the new report by the nonpartisan Pew Research Center.

Read more in the Los Angeles Times.