There are many new things happening on a daily basis. It's just that most of the mainstream media have knowingly and deliberately stopped reporting "all" the news in an unbiased and honest manner. They have instead chosen to report what they want you to see and hear, not what we should and need to see and hear. On top of that, the mainstream media (CBS, NBC, CNN, etc.) have chosen to politicize pretty much everything, and they are collapsing in large part because of liberal, anti-American content. Which makes one wonder: who controls the mainstream media?
What you said got my attention because I had never heard that before, so I looked it up. I found an article that says, in part:
"It all started in the 14th century when the English word ‘news’ developed as a special use of the plural form of ‘new’. As the name implies, ‘news’ is associated with the presentation of new information."
Whether or not that article is correct, I don't know, but it was interesting to look up. I love learning new things, and researching them is half the fun. :)