Talk:Music perception

From Citizendium

{{subpages}}
I want to build my presentation around the question "Will there ever be a computational model of music perception?". For the visual sensory system, people are developing models of the early processing stages and, at a more advanced level, of scene and object recognition. I don't know how far development has come in the auditory domain.
One would have to discuss models of early auditory processing (things like volume, pitch and harmony) before moving on to aspects like rhythm, melody and music itself. This raises the question of what to expect from a possible model of music perception: what could it practically account for?
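As a toy illustration of what a model of one early processing stage might look like (my own sketch, not part of any model proposed here), pitch can be estimated from the autocorrelation of the waveform — the lag of the strongest self-similarity peak gives the fundamental period:

```python
import numpy as np

def estimate_pitch(signal, sample_rate, fmin=50.0, fmax=1000.0):
    """Estimate the fundamental frequency (Hz) via autocorrelation peak picking."""
    signal = signal - np.mean(signal)
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]            # keep non-negative lags only
    lag_min = int(sample_rate / fmax)       # shortest period to consider
    lag_max = int(sample_rate / fmin)       # longest period to consider
    best_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / best_lag

# A 400 Hz sine sampled at 8 kHz: the period (20 samples) divides evenly,
# so the estimate is exact.
sr = 8000
t = np.arange(2000) / sr
tone = np.sin(2 * np.pi * 400.0 * t)
print(estimate_pitch(tone, sr))  # → 400.0
```

Real music would of course require windowed analysis and handling of polyphony; this only sketches the simplest monophonic case.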
I see at least one connection between the notion of "music" and possible neural models: music is a highly structured input and might therefore help to shape the neural circuitry for auditory processing, similar to how natural images (as structured visual input) can shape V1-like receptive fields under neurobiologically plausible plasticity rules.
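The point about plasticity can be illustrated with a minimal simulation — my own sketch, using Oja's rule as one well-known neurobiologically plausible (Hebbian-type) learning rule: a single linear neuron exposed to statistically structured input develops weights that reflect the dominant structure of that input (its leading principal component), analogous to receptive fields adapting to stimulus statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Structured two-dimensional input: samples correlated along the (1, 1)
# direction, standing in for the regularities of natural images or music.
cov = np.array([[3.0, 2.5],
                [2.5, 3.0]])
samples = rng.multivariate_normal([0.0, 0.0], cov, size=5000)

w = rng.normal(size=2)            # random initial synaptic weights
eta = 0.005                       # learning rate

for x in samples:
    y = w @ x                     # linear neuron's response
    w += eta * y * (x - y * w)    # Oja's rule: Hebbian term plus weight decay

# The weights converge to a unit vector along the leading principal
# component of the input, here approximately the (1, 1)/sqrt(2) direction.
w_unit = w / np.linalg.norm(w)
print(w_unit)
```

The same principle scales up: trained on patches of natural images, such rules yield V1-like oriented filters, which is the analogy drawn above for structured auditory input.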
[[User:Felipe Gerhard|Felipe Gerhard]] 10:08, 10 July 2008 (CDT)
----
In the end, it turned out to be a bit more general than I thought.
What is still needed for a complete article: build in the references and cross-links to other topics, and add the picture of the modular framework.
[[User:Felipe Gerhard|Felipe Gerhard]] 18:34, 6 September 2008 (CDT)

Latest revision as of 17:34, 6 September 2008

This article is developing and not approved.

Definition: The study of the neural mechanisms involved in people perceiving rhythms, melodies, harmonies and other musical features.