
TensorFlow Certification Exam

Sorry if this has already been asked, but I searched for it without success. I am thinking about applying for the TensorFlow certification exam. My first question is whether custom activation functions are allowed during the exam.

For example, imagine a question about a regression where the data is:

import numpy as np

features = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0], dtype=float)
targets = np.array([0.0, 1.0, 4.0, 9.0, 16.0, 25.0, 36.0], dtype=float)

This is clearly an x^2 problem. Could I do something like this?

import tensorflow as tf

# Custom activation that squares the layer output
tf_lpow = lambda x: tf.math.pow(x, 2)

model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(units=1, activation=tf_lpow, input_shape=(1,)),
])
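For completeness, this is roughly how I would compile and train it; the optimizer, loss, epoch count, and the prediction check are just assumptions I am making for illustration, not anything the exam prescribes:

# Training sketch (optimizer, loss and epochs are my own assumptions)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01), loss='mse')
model.fit(features, targets, epochs=500, verbose=0)

# Sanity check on an unseen value; should approach 49.0 if the fit works
print(model.predict(np.array([[7.0]])))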

In case that is not allowed, I was thinking about another solution:

from tensorflow.keras.callbacks import ReduceLROnPlateau

lr_scheduler = ReduceLROnPlateau(monitor='loss', factor=0.75, patience=50, min_lr=3e-80)
callbacks = [lr_scheduler]
model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(units=6, activation='sigmoid',
                          kernel_regularizer=tf.keras.regularizers.l2(0.01),
                          kernel_initializer=tf.keras.initializers.RandomNormal(stddev=0.01),
                          input_shape=(1,)),
    tf.keras.layers.Dense(units=1, activation='linear')
])
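This is roughly how I am wiring everything together; again, the optimizer, learning rate, metric, and epoch count are my own assumptions rather than anything prescribed:

# Training sketch for the second approach (hyperparameters are assumptions)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.1),
              loss='mse',
              metrics=['accuracy'])

# The ReduceLROnPlateau callback is passed to fit() so it can monitor the loss
history = model.fit(features, targets, epochs=1000, callbacks=callbacks, verbose=0)

print(history.history['loss'][-1], history.history['accuracy'][-1])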

But even though the loss is decreasing, the accuracy is stuck at 0.2857 and never reaches the goal. In this case, what could I do?

Thanks in advance.

Nikhil sharma

asked Oct 7, 2022

3700 views · 0 answers · 0 votes