
NNadir

(33,525 posts)
Sat Jan 23, 2016, 10:05 AM

The "Extreme Learning Machine."

I'm most definitely snowed in today and am leafing through some issues of one of my favorite journals, Industrial & Engineering Chemistry Research. I came across a cool paper about one of my favorite topics, ionic liquids, that discusses the "Extreme Learning Machine."

Ionic liquids are generally salts of organic cations and anions that are liquid at or near room temperature. Because they are generally not volatile, they can eliminate some of the problems associated with other process solvents, specifically air pollution. Although the term "green solvent" is probably overused with respect to ionic liquids, their very interesting potential uses have led to a vast explosion of papers in the scientific literature concerning them. There is, to be sure, an almost infinite number of possible ionic liquids (and related liquids called "deep eutectics").

My own interest in these compounds is connected with my interest in the separation of fission products and actinides in the reprocessing of used nuclear fuels. I am also interested in their potential for the treatment of certain biological products, including lignins, a constituent of biomass quite different from cellulose that represents a sustainable route of access to aromatic molecules, and in their possible use as radiation-resistant (in some cases) high-temperature heat transfer fluids.

Anyway, about the "Extreme Learning Machine": the paper in question, written by scientists at the Beijing Key Laboratory of Ionic Liquids Clean Process, State Key Laboratory of Multiphase Complex Systems, Institute of Process Engineering, Chinese Academy of Sciences, Beijing, China, is this one: Ind. Eng. Chem. Res. 2015, 54 (51), 12987−12992.

The Sσ-profile is a quantum mechanical descriptor of the charge distribution on the surfaces of molecules and organic ions.

Here's the fascinating text:

As compared to the ANN algorithm, the extreme learning machine (ELM) is a relatively new algorithm which was first developed by Huang et al.23,24 It can effectively tend to reach a global optimum and only needs to learn a few parameters between the hidden layer and the output layer as compared with the traditional ANN and thus can be used to predict properties because of its excellent efficiency and generalization performance.25 However, to the best of our knowledge, the ELM has not yet been used for predicting the properties of ILs until now. Thus, we employed this relatively new ELM algorithm to predict the heat capacity of ILs in this work.


Reference 24 is" Huang, G.-B.; Zhu, Q.-Y.; Siew, C.-K. Extreme learning machine: Theory and applications. Neurocomputing 2006, 70, 489−501.

Hmm...the program needs to "learn" only a few parameters...
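For the curious, here is a minimal sketch of the idea in Python/NumPy (a toy illustration of the general ELM recipe, not the paper's actual code). The input-to-hidden weights are random and never updated; the only "learned" parameters are the hidden-to-output weights, solved in one shot by least squares.

import numpy as np

def elm_train(X, T, n_hidden=50, seed=0):
    # X: (n_samples, n_features) inputs; T: (n_samples, n_outputs) targets
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights, fixed
    b = rng.standard_normal(n_hidden)                # random biases, also fixed
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, T, rcond=None)     # the only parameters learned
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: fit y = sin(x) from noisy samples.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X) + 0.05 * np.random.default_rng(1).standard_normal(X.shape)
W, b, beta = elm_train(X, T)
print(np.mean((elm_predict(X, W, b, beta) - T) ** 2))  # training error

Because training reduces to a single linear least-squares solve, it is very fast, which is presumably the "excellent efficiency" the authors mention.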

I always keep in the back of my mind Penrose's criticism of the concept of "artificial intelligence" (maybe because, being a human being, I still want my species to be relevant), but I'm intrigued. Neurocomputing is a journal I've never accessed before, but when I can get out of here after this blizzard, I'm going to take a look at that paper, which is apparently available at Princeton University's library.

I guess I'm a dork, but I find it all kind of cool...

phantom power

(25,966 posts)
1. ELMs are nice because you can solve an entire weight layer simultaneously
Sat Jan 23, 2016, 03:48 PM

Instead of bumping down the gradient in the more traditional Boltzmann style. That also makes it nice for scale-out parallel computing environments like Hadoop or Spark.
It does require some compromises when training a multi-layer network. You end up optimizing one layer, then another... kind of like tightening bolts on a gasket: you circle around a few times until everybody is tight.
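A rough sketch of that round-robin idea (my own toy illustration in Python/NumPy, assuming one hidden layer whose weights are re-solved against back-projected targets; not any particular published algorithm):

import numpy as np

def fit_round_robin(X, T, n_hidden=30, sweeps=5, seed=0):
    # Alternately re-solve each weight layer by least squares while
    # holding the other fixed -- the "tightening bolts" loop.
    rng = np.random.default_rng(seed)
    W1 = rng.standard_normal((X.shape[1], n_hidden))
    for _ in range(sweeps):
        H = np.tanh(X @ W1)
        W2, *_ = np.linalg.lstsq(H, T, rcond=None)    # solve the output layer
        H_target = T @ np.linalg.pinv(W2)             # best hidden outputs for this W2
        H_target = np.clip(H_target, -0.999, 0.999)   # keep arctanh finite
        W1, *_ = np.linalg.lstsq(X, np.arctanh(H_target), rcond=None)  # re-solve hidden layer
    H = np.tanh(X @ W1)
    W2, *_ = np.linalg.lstsq(H, T, rcond=None)
    return W1, W2

Each sweep tightens one layer while the other is held fixed, and a few passes around the loop usually settle things down, much like the gasket.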

NNadir

(33,525 posts)
2. Your reply certainly stimulated some thought on my end...
Sun Jan 24, 2016, 10:17 AM

Actually I'm fairly ignorant of the "nuts and bolts" of computer science, although I did see in Google Scholar that the "Extreme Learning Machine" concept has been around for quite some time.

I have a general feel for computational chemistry theory and the general concept of the Kohn-Sham formulation of "normal" density functional theory, and I am trying to wrap my little brain around "orbital-free density functional theory," but I don't know how it all works on a computational level.

From what I gather from poking around the internet while snowed in (throwing in some of your lexicon, and I'm not likely to be able to leave this house without many hours of digging), the concept of "machine learning" involves using a "learning set," or "training set," computing a "best fit" using (hopefully convergent) calculations, and then weighting the results to approach a better fit.
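If I have it right, the skeleton looks something like this toy gradient-descent fit (my naive sketch in Python/NumPy, surely glossing over the real subtleties):

import numpy as np

# A training set: 100 samples with 3 features and noisy targets.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.standard_normal(100)

w = np.zeros(3)                     # initial guess at the weights
lr = 0.1                            # step size
for step in range(500):
    residual = X @ w - y            # how wrong the current fit is
    grad = X.T @ residual / len(y)  # direction that increases the error fastest
    w -= lr * grad                  # nudge the weights toward a better fit
print(w)                            # converges near [2.0, -1.0, 0.5]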

For many chemists, myself certainly included, since I am running out of time on the planet and will never have the time to learn the details, the "nuts and bolts" of computational science, it's all rather like ordering a meal in a fine restaurant: you don't know how the food was prepared, but you enjoy the taste anyway.

My youngest son, still in high school, is considering a career in Materials Science, and I hope he will be inspired to learn more about these important "nuts and bolts" than his old man did.

Thanks for your stimulating comment!

Massacure

(7,525 posts)
3. What you gathered from poking around the internet was correct.
Mon Jan 25, 2016, 11:34 PM

I graduated with a degree in computer science, and I took an artificial neural networks class that one of the professors offered only once every other year. It was basically a graduate-level class toned down a bit for upper-level undergraduate students. I'll never forget one of the homework assignments we had: given the size of a bunny's pupil, predict its age. The grade we received was dictated by how close our predictions were. Our professor had to reassign the homework because most of us were so far off when we submitted it the first time around. I reminisce about it now, but it was a stressful assignment back when I was working on it. Finding algorithms that work for the training involves a lot of trial and error.

Anyways, your comments about using a learning set to train the network and using the results to tune it are spot on. One caveat, though, is that you don't want an algorithm that is too convergent, or the network becomes overtrained. This is particularly true when data points gradually shift over time: the network needs to be able to "forget" to some degree what worked in the past. A good example might be a network taught to predict the winner of a football game played in the NFL. The NFL has become more offense oriented, and offenses more pass oriented, over time, so an algorithm trained on games from this season may work really well for games next season but be awful at predicting games ten years from now.
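One simple way to build in that kind of "forgetting" (a hypothetical sketch, nothing specific to football models) is to weight each training example by its age, so stale games count less in the fit:

import numpy as np

def decayed_weights(ages_in_seasons, half_life=2.0):
    # Exponential decay: a game half_life seasons old counts
    # half as much as a game from the current season.
    return 0.5 ** (np.asarray(ages_in_seasons, dtype=float) / half_life)

# Games from this season, last season, and ten seasons ago:
print(decayed_weights([0, 1, 10]))  # -> [1.0, 0.707..., 0.031...]

These weights can then feed any weighted fit: weighted least squares, per-sample weights in a network's loss, and so on.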

jakeXT

(10,575 posts)
4. One application of machine learning is text classification
Thu Jan 28, 2016, 08:28 AM

For example, a spam filter that decides whether an email is spam or a normal email.

https://en.m.wikipedia.org/wiki/Naive_Bayes_spam_filtering
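A bare-bones sketch of how such a filter works (toy data, word counts, and add-one smoothing in Python; real filters are considerably more elaborate):

from collections import Counter
import math

# A tiny labeled "training set".
spam = ["win money now", "cheap money win", "win win prize"]
ham = ["meeting notes attached", "lunch tomorrow", "project notes"]

spam_counts = Counter(w for msg in spam for w in msg.split())
ham_counts = Counter(w for msg in ham for w in msg.split())
vocab = set(spam_counts) | set(ham_counts)

def log_prob(msg, counts, n_class, n_total):
    # log P(class) + sum of log P(word | class), add-one smoothed.
    total_words = sum(counts.values())
    score = math.log(n_class / n_total)
    for w in msg.split():
        score += math.log((counts[w] + 1) / (total_words + len(vocab)))
    return score

def classify(msg):
    n = len(spam) + len(ham)
    s = log_prob(msg, spam_counts, len(spam), n)
    h = log_prob(msg, ham_counts, len(ham), n)
    return "spam" if s > h else "ham"

print(classify("win cheap prize"))  # -> spam
print(classify("notes for lunch"))  # -> ham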
