Different results in a function approximation problem using MLPRegressor and Keras





I am getting different results in a function approximation problem. I am trying to approximate a sine wave using MLPRegressor and Keras (one Dense layer).
Here is the code for the MLPRegressor:



import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Create a dataset
X_train = np.arange(0.0, 1, 0.01).reshape(-1, 1)
noise = np.random.normal(0, 0.1, 100).reshape(-1, 1)

y_train = np.sin(2 * np.pi * X_train)
y_train = y_train + noise
y_train = y_train.ravel()  # flatten to a 1D array

#X_train = np.arange(0.0, 1, 0.01).reshape(-1, 1)
#y_train = np.sin(2 * np.pi * X_train).ravel()


# Experiments
# hidden_layer_sizes: 1, 3, 100
# max_iter: 10, 100, 1000
nn = MLPRegressor(
    hidden_layer_sizes=(3,), activation='tanh', solver='lbfgs', alpha=0.000, batch_size='auto',
    learning_rate='constant', learning_rate_init=0.01, power_t=0.5, max_iter=80, shuffle=True,
    random_state=0, tol=0.0001, verbose=True, warm_start=False, momentum=0.0, nesterovs_momentum=False,
    early_stopping=False, validation_fraction=0.0, beta_1=0.9, beta_2=0.999, epsilon=1e-08)

# Train the network
n = nn.fit(X_train, y_train)

# Predictions on the training set
predict_train = nn.predict(X_train)

# Plot the training fit
fig = plt.figure()
ax1 = fig.add_subplot(111)
ax1.scatter(X_train, y_train, s=5, c='b', marker="o", label='real')
ax1.plot(X_train, predict_train, c='r', label='NN Prediction')


# Test set
X_test = np.arange(0.0, 1, 0.01).reshape(-1, 1)
y_test = np.sin(2 * np.pi * X_test) + np.random.normal(0, 0.2, 100).reshape(-1, 1)
y_test = y_test.ravel()


# Predictions on the test set
predict_test = nn.predict(X_test)

fig = plt.figure()
ax1 = fig.add_subplot(111)
ax1.scatter(X_test, y_test, s=5, c='b', marker="o", label='real')
ax1.plot(X_test, predict_test, c='r', label='NN Prediction')

plt.legend()
plt.show()

print('MSE training : {:.3f}'.format(mean_squared_error(y_train, predict_train)))
print('MSE testing : {:.3f}'.format(mean_squared_error(y_test, predict_test)))


Using MLPRegressor, I found satisfactory results with just 3 neurons. However, when I try to use Keras, I cannot get reasonable results. The code is very similar, except for the optimizer and the activation function. Here is the code for Keras:



import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

import keras
from keras.models import Sequential
from keras.layers import Dense
from sklearn.metrics import mean_squared_error


# Create a dataset
X_train = np.arange(0.0, 1, 0.01).reshape(-1, 1)
noise = np.random.normal(0, 0.1, 100).reshape(-1, 1)

y_train = np.sin(2 * np.pi * X_train)
y_train = y_train + noise
y_train = y_train.ravel()  # flatten to a 1D array


# Build the network
nn = Sequential()  # sequence of layers
# activation: sigmoid, tanh, relu, linear
# units: number of neurons in the layer
# the first hidden layer takes input_dim
nn.add(Dense(units=100, activation='relu',
             kernel_initializer='random_uniform', input_dim=1))
nn.add(Dense(units=1, activation='linear'))

# Learning algorithm
#sgd = keras.optimizers.SGD(lr=0.1, decay=0, momentum=0, nesterov=False)
adam = keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)

# Set the loss function and the metric to track
nn.compile(loss='mean_squared_error', optimizer=adam,
           metrics=['mean_squared_error'])
history = nn.fit(X_train, y_train, batch_size=1, epochs=1000)

# Predictions on the training set
predict_train = nn.predict(X_train)

# Plot the training fit
fig = plt.figure()
ax1 = fig.add_subplot(111)
ax1.scatter(X_train, y_train, s=5, c='b', marker="o", label='real')
ax1.plot(X_train, predict_train, c='r', label='NN Prediction')


# Test set
X_test = np.arange(0.0, 1, 0.01).reshape(-1, 1)
y_test = np.sin(2 * np.pi * X_test) + np.random.normal(0, 0.2, 100).reshape(-1, 1)
y_test = y_test.ravel()


# Predictions on the test set
predict_test = nn.predict(X_test)

fig = plt.figure()
ax1 = fig.add_subplot(111)
ax1.scatter(X_test, y_test, s=5, c='b', marker="o", label='real')
ax1.plot(X_test, predict_test, c='r', label='NN Prediction')

plt.legend()
plt.show()

print('MSE training : {:.3f}'.format(mean_squared_error(y_train, predict_train)))
print('MSE testing : {:.3f}'.format(mean_squared_error(y_test, predict_test)))


I have already tried SGD as the optimizer and tanh as the activation function. I do not understand what I am missing and why I cannot make the Keras function-approximation code work.
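For reference, the closest Keras mirror of the MLPRegressor architecture that I can write is sketched below (3 tanh hidden units and a linear output). Keras has no lbfgs solver, so the Adam learning rate, full-batch updates, and epoch count here are only rough guesses meant to stand in for it; this is an illustrative sketch, not one of the runs reported above:

import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Dense

# Same data as above: one period of a noisy sine on [0, 1)
X_train = np.arange(0.0, 1, 0.01).reshape(-1, 1)
y_train = (np.sin(2 * np.pi * X_train) + np.random.normal(0, 0.1, 100).reshape(-1, 1)).ravel()

# Mirror the MLPRegressor architecture: 3 tanh hidden units, linear output
model = Sequential()
model.add(Dense(units=3, activation='tanh', input_dim=1))
model.add(Dense(units=1, activation='linear'))

# lbfgs is not available in Keras; Adam with a larger learning rate,
# full-batch updates, and many epochs is a rough stand-in (values are guesses)
adam = keras.optimizers.Adam(lr=0.01)
model.compile(loss='mean_squared_error', optimizer=adam)
model.fit(X_train, y_train, batch_size=len(X_train), epochs=5000, verbose=0)

print(model.evaluate(X_train, y_train, verbose=0))  # training MSE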










Tags: keras, mlp






      asked Apr 6 at 19:24









Jorge Amaral
