help with terminology

Post by Paul Compt » Wed, 14 Apr 2004 03:11:23


Hi
Please excuse my "interrupting" your newsgroup life! It is too many years
since I studied AI, and I can't remember my terminology. These days I work
as a translator, and I was recently asked for the English for the German
term "Absenkungsfaktor" in the context of neural nets. I can translate it
in general terms without a problem, but that doesn't help!

So, any German speakers out there, can you help me please? Many thanks!

Paul Compton
Hamburg
 
 
 


Post by Paul Compt » Thu, 15 Apr 2004 14:24:22


It seems that nobody here can help me - or that I have approached the
group incorrectly, somehow. Perhaps someone could at least point me in the
right direction, i.e. where I may find the answer?

Many thanks once again
Paul Compton

 
 
 


Post by zboge » Fri, 16 Apr 2004 22:18:34

Try getting in touch with Rene Weber XXXX@XXXXX.COM or XXXX@XXXXX.COM.

Zvi Boger

OPTIMAL - Industrial Neural Systems
 
 
 


Post by Wil Hadde » Sat, 17 Apr 2004 01:36:01

Absenkungsfaktor - Disadvantage factor

At a guess I'd say it means negative weight.






 
 
 


Post by konar.laro » Mon, 19 Apr 2004 06:01:16

Hello Paul,

I don't know of any meaning of "Absenkungsfaktor" in the context
of neural nets in German. It's rather an unusual word combination,
where "Absenkung" refers to a geological event (drawdown).
=> Gradient descent???

Are you able to post the whole sentence or paragraph?

The word "Absenkung" implies that it should be a number that
reduces another number.
Thus an "Absenkungsfaktor" ("Faktor" means multiplication)
should be smaller than 1.

[Result] = [Number] multiplied by ["Absenkungsfaktor"]
y = x * b
80 = 100 * 0.8

The result (product) y, when a number x is multiplied by b - where
b is the "Absenkungsfaktor" - is smaller than x.
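Konar's worked example can be sketched in a few lines of Python (a minimal illustration of the arithmetic, not code from any post here):

```python
# Repeatedly applying a reduction factor ("Absenkungsfaktor") b < 1
# shrinks a value: each step multiplies by b, so after n steps the
# value is x * b**n (geometric decay).
def apply_factor(x, b, steps=1):
    for _ in range(steps):
        x *= b
    return x
```

apply_factor(100, 0.8) reproduces Konar's 80 = 100 * 0.8; with more steps the value keeps shrinking toward zero.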

Regards,
Konar
 
 
 


Post by Fred Mailh » Mon, 19 Apr 2004 10:50:58


[snip]



Sounds a lot like "learning rate", then....


Fred.
 
 
 


Post by Paul Compt » Mon, 19 Apr 2004 20:30:23


That was my initial thought too, but with "Absenkung" being more a
reduction (subsidence in geological usage) I could hardly imagine that.
Here is a description of the context from the guy who asked me - which
makes it clear that it is not the "learning rate" itself, but must come
pretty close to defining it:

###I programmed a small application to use a Kohonen Self-Organizing
Map (SOM) for cluster analysis. It's a 2-layer ANN (input and output layer)
with a 1-dimensional output layer. The underlying algorithm is unsupervised
and uses a linearly decreasing learning function. The learning rate lr will
be decreased at every iteration step using "lr_new = 'Absenkungsfaktor'
* lr_old".###
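For illustration, a hedged sketch of what such a program might look like (hypothetical code, not the questioner's actual application): a tiny 1-D Kohonen SOM whose learning rate is multiplied by a reduction factor at every update step.

```python
# Minimal 1-D Kohonen SOM sketch. The names and parameters here are
# assumptions for illustration; only the update rule
# "lr_new = factor * lr_old" comes from the quoted description.
import random

def train_som(data, n_units=10, lr=0.5, factor=0.99, epochs=100):
    # one scalar weight per output unit (1-D inputs for simplicity)
    weights = [random.random() for _ in range(n_units)]
    for _ in range(epochs):
        for x in data:
            # best-matching unit: the unit whose weight is closest to x
            bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
            # move the winner and its immediate neighbours toward the input
            for i in range(n_units):
                if abs(i - bmu) <= 1:  # fixed neighbourhood radius of 1
                    weights[i] += lr * (x - weights[i])
            lr *= factor  # lr_new = "Absenkungsfaktor" * lr_old
    return sorted(weights)
```

With factor < 1 the learning rate decays geometrically, so early inputs move the weights a lot and later inputs only fine-tune them.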

Thanks for your help.
 
 
 


Post by Paul Compt » Mon, 19 Apr 2004 20:31:17


Here again (as posted a few minutes ago on another subthread here) is the
description given by the programmer who asked me, when I pressed him for
more info:

###I programmed a small application to use a Kohonen Self-Organizing
Map (SOM) for cluster analysis. It's a 2-layer ANN (input and output layer)
with a 1-dimensional output layer. The underlying algorithm is unsupervised
and uses a linearly decreasing learning function. The learning rate lr will
be decreased at every iteration step using "lr_new = 'Absenkungsfaktor'
* lr_old".###

Thanks for your help.
 
 
 


Post by Paul Compt » Mon, 19 Apr 2004 20:32:17


The day-to-day usage was what I also pressed the questioner for in
the end, and I include it here for the sake of completeness - all
subthreads have it now!

###I programmed a small application to use a Kohonen Self-Organizing
Map (SOM) for cluster analysis. It's a 2-layer ANN (input and output layer)
with a 1-dimensional output layer. The underlying algorithm is unsupervised
and uses a linearly decreasing learning function. The learning rate lr will
be decreased at every iteration step using "lr_new = 'Absenkungsfaktor'
* lr_old".###

Thanks for your help.
 
 
 


Post by Paul Compt » Tue, 20 Apr 2004 05:36:22


Thanks heaps, Fred. I spent the morning reading through texts on the
subject, bringing myself a little way back up to speed - unbelievable how
much one "unlearns" over a period of only two years away from AI. Could
have something to do with the memory-impairing illness, I guess. :-(
Anyway, I came to the conclusion that my interlocutor has programmed an
extreme simplification, in that most SOM algorithms suggest some other
form of formulaic decrease. So I was also wondering whether there even
can be a standard term. This is probably why I haven't found it in any of
my German technical references either! I will probably opt for "learning
rate decrement" and your own suggestion as the possibilities to present to
him.

Many thanks

Paul Compton
 
 
 


Post by Fred Mailh » Tue, 20 Apr 2004 08:25:38


Ah... I see now... many (most? all?) convergence results in ML for
iterative methods that use learning rates require that the learning rate
decrease in a particular way...

I don't think there's an English term for the factor by which the learning
rate is decreased... you might consider using "the factor by which the
learning rate is decreased" as a possible locution, though...


Cheers,

Fred.
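Fred's point about the learning rate having to decrease "in a particular way" matches the classic Robbins-Monro conditions from stochastic approximation (my framing, not Fred's): the rates must sum to infinity while their squares sum to something finite. A quick numeric sketch:

```python
# Robbins-Monro conditions: sum(lr_t) must diverge while sum(lr_t**2)
# stays finite. A 1/t schedule meets both; a constant factor b < 1
# applied every step (geometric decay, as in
# lr_new = Absenkungsfaktor * lr_old) makes even sum(lr_t) finite,
# so it falls outside those classical guarantees.
def partial_sums(schedule, n):
    s = sq = 0.0
    for t in range(1, n + 1):
        lr = schedule(t)
        s += lr
        sq += lr * lr
    return s, sq

harmonic = lambda t: 1.0 / t    # sum grows without bound; sum of squares bounded
geometric = lambda t: 0.9 ** t  # sum itself converges (to 9.0)
```

In practice a geometric decay often works fine for a SOM; the point is only that it decays faster than the classical theory asks for.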
 
 
 


Post by heat » Thu, 22 Apr 2004 12:25:10


I don't read or speak German. However, from the context of the other
replies I vote for learning rate 'reduction factor'.

Hope this helps.

Greg
 
 
 


Post by Wil Hadde » Thu, 22 Apr 2004 17:04:24


I amend my vote to the same.

Wil