Spring Batch with Java Config

In a real-life application we are building for one of our customers, we needed some form of batch operations with transaction management.
Without a doubt our first candidate was Spring Batch. The most common way to configure Spring Batch is via XML, but since the introduction of Java Configuration classes we prefer that way of configuring applications (without XML, that is). We found out the hard way that with Spring Batch there is more to it than just translating the XML to Java Configuration, since a lot of issues come to the surface when wiring the application context. And since the Spring Batch documentation tells you very little about Java Config, I would like to give you a few insights into what we did to get it to work.

Split the Job Configuration and the Job’s plumbing

In our application we have component scanning turned off for configuration classes, to prevent test configuration from being wired into the application.
Therefore we need to create a class in which we import all JobConfigurations (this class is imported into our ContextConfig class).

Besides pointing to the JobConfiguration classes, this class also creates a JobRegistryBeanPostProcessor bean.
This is needed to autowire a JobRegistry into the rest of your application; you need that registry to access your own jobs later on, as the usage sketch after the configuration class shows.

@Configuration
@Import({MyJobConfiguration.class, MySecondJobConfiguration.class})
public class BatchJobConfiguration {

  @Bean
  JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor(JobRegistry jobRegistry) {
    JobRegistryBeanPostProcessor postProcessor = new JobRegistryBeanPostProcessor();
    postProcessor.setJobRegistry(jobRegistry);
    return postProcessor;
  }
}
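
As a quick illustration of that last point: once the post processor has registered your jobs, you can look them up by name and launch them. The BatchJobStarter class below is a minimal, hypothetical sketch; JobRegistry and JobLauncher are the actual Spring Batch types involved.

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.configuration.JobRegistry;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class BatchJobStarter {

  @Autowired
  private JobRegistry jobRegistry;

  @Autowired
  private JobLauncher jobLauncher;

  public void startJob(String jobName) throws Exception {
    // look up the job that the JobRegistryBeanPostProcessor registered
    Job job = jobRegistry.getJob(jobName);
    // add a unique parameter, so the same job can be started more than once
    JobParameters parameters = new JobParametersBuilder()
        .addLong("startedAt", System.currentTimeMillis())
        .toJobParameters();
    jobLauncher.run(job, parameters);
  }
}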

Configuring the TransactionManager

This is the most complex part of using Java Configuration. If you dive into the problem, it almost feels like Spring did not build Spring Batch with Java Configuration classes in mind, because a lot of timing issues are introduced. With XML configuration the whole application context is wired first, and after that Spring starts configuring its setup. With Java Configuration classes this process works quite differently.

The main problem: an autowired transaction manager does not work with Java Config

When you autowire your JPA transaction manager, the latest version of Spring Batch (3.0.6.RELEASE at the time of writing) overrides this bean with a default JDBC transaction manager. To overcome this behaviour, you need your own bean that implements BatchConfigurer. We tried to take one of the BatchConfigurer classes that Spring creates by default and wire the JPA transaction manager into it, but in that case Spring wraps your TransactionManager in a lazy proxy, and that will result in your application not saving any data anymore.

The bean that you have to create is basically a copy of Spring's implementation. In this example we autowire our DataAccessConfig, the bean that holds all the information about our TransactionManager and DataSource. Via this DataAccessConfig we wire our DataSource and TransactionManager into Spring Batch. When Spring Batch starts up, this BatchConfigurer bean is autowired into Spring's list of BatchConfigurer beans. Spring Batch then decides not to create any defaults, but to use this configuration without doing anything else to it.

import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.explore.support.JobExplorerFactoryBean;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;

import javax.annotation.PostConstruct;

@Component
@EnableBatchProcessing(modular = true)
public class BatchConfiguration implements BatchConfigurer {

    private final DataAccessConfig dataAccessConfig;

    private JobRepository jobRepository;
    private JobExplorer jobExplorer;
    private JobLauncher jobLauncher;

    @Autowired
    public BatchConfiguration(DataAccessConfig dataAccessConfig) {
        this.dataAccessConfig = dataAccessConfig;
    }
    @Override
    public JobRepository getJobRepository() throws Exception {
        return this.jobRepository;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        return this.dataAccessConfig.transactionManager();
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        return this.jobLauncher;
    }

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        return this.jobExplorer;
    }

    @PostConstruct
    public void afterPropertiesSet() throws Exception {
        this.jobRepository = createJobRepository();

        JobExplorerFactoryBean jobExplorerFactoryBean = new JobExplorerFactoryBean();
        jobExplorerFactoryBean.setDataSource(this.dataAccessConfig.dataSource());
        jobExplorerFactoryBean.afterPropertiesSet();
        this.jobExplorer = jobExplorerFactoryBean.getObject();

        this.jobLauncher = createJobLauncher();
    }

    protected JobLauncher createJobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(this.dataAccessConfig.dataSource());
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}

Configuring the dialect

After doing this, there was still one exception popping up: the database dialect was falling back to a default.
Even though your application itself can rely on Hibernate's dialect auto-detection, you need to configure the dialect explicitly to help Spring Batch; in the configuration below we do this in the jpaVendorAdapter bean.

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("nl.mypackage.something")
public class DataAccessConfig {

  private static final Logger LOGGER = LoggerFactory.getLogger(DataAccessConfig.class);

  @Autowired
  Environment env;

  @Bean
  public DataSource dataSource() {
    HikariConfig config = new HikariConfig();
    // note that it's even better to directly inject a DB DataSource in the config,
    // but that makes it DB-specific. For now we therefore use the traditional 
    // method of configuring a driver.
    // Also note that we're using Spring Boot's property names for configuration.
    config.setJdbcUrl(env.getRequiredProperty("spring.datasource.url"));
    config.setUsername(env.getProperty("spring.datasource.username"));
    config.setPassword(env.getProperty("spring.datasource.password"));

    // setting catalog is optional. Note that Spring Boot does *not* define a
    // standard property name for this, but for consistency we use their prefix.
    config.setCatalog(env.getProperty("spring.datasource.catalog"));

    config.setMinimumIdle(env.getProperty("spring.datasource.min-idle", 
                                          Integer.class, 2));
    config.setMaximumPoolSize(env.getProperty("spring.datasource.max-active", 
                                              Integer.class, 100));

    // ensure that we're using read-committed, regardless of the default used 
    // by the DBMS
    config.setTransactionIsolation("TRANSACTION_READ_COMMITTED");

    // expose info about the pool through JMX
    config.setRegisterMbeans(true);

    return new HikariDataSource(config);
  }

  // THIS PART IS ONLY HERE FOR SPRING BATCH
  @Bean
  JpaVendorAdapter jpaVendorAdapter() {
    HibernateJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
    // set the dialect explicitly (the PostgreSQL default here is just an example)
    adapter.setDatabasePlatform(env.getProperty(
      "spring.jpa.properties.hibernate.dialect",
      "org.hibernate.dialect.PostgreSQL9Dialect"));
    return adapter;
  }

  @Bean
  @DependsOn("flyway")
  LocalContainerEntityManagerFactoryBean entityManagerFactory() {
    LocalContainerEntityManagerFactoryBean emfBean = 
      new LocalContainerEntityManagerFactoryBean();
    emfBean.setDataSource(dataSource());
    emfBean.setPackagesToScan("nl.mypackage.something");
    emfBean.setJpaVendorAdapter(jpaVendorAdapter());

    Properties jpaProps = new Properties();
    jpaProps.put("hibernate.physical_naming_strategy", 
                 PostgresPhysicalNamingStrategy.class.getName());
    jpaProps.put("hibernate.hbm2ddl.auto", env.getProperty(
      "spring.jpa.hibernate.ddl-auto", "none"));
    jpaProps.put("hibernate.jdbc.fetch_size", env.getProperty(
      "spring.jpa.properties.hibernate.jdbc.fetch_size", 
      "200"));

    Integer batchSize = env.getProperty(
      "spring.jpa.properties.hibernate.jdbc.batch_size", 
      Integer.class, 100);
    if (batchSize > 0) {
      jpaProps.put("hibernate.jdbc.batch_size", batchSize);
      jpaProps.put("hibernate.order_inserts", "true");
      jpaProps.put("hibernate.order_updates", "true");
    }

    jpaProps.put("hibernate.show_sql", env.getProperty(
      "spring.jpa.properties.hibernate.show_sql", "false"));
    jpaProps.put("hibernate.format_sql",env.getProperty(
      "spring.jpa.properties.hibernate.format_sql", "false"));

    jpaProps.put("jadira.usertype.autoRegisterUserTypes", "true");
    emfBean.setJpaProperties(jpaProps);

    return emfBean;
  }

  @Bean
  public PlatformTransactionManager transactionManager() {
    return new JpaTransactionManager(entityManagerFactory().getObject());
  }

  @Bean
  Flyway flyway(DataSource dataSource) {
    Flyway flyway = new Flyway();
    flyway.setDataSource(dataSource);
    if (env.getProperty("flyway.clean-on-validation-error", 
                        Boolean.class, Boolean.FALSE)) {
      LOGGER.warn("Enabling Flyway cleanOnValidationError: "
                  + "this should NEVER be enabled on ACC/PRD environments!");
      flyway.setCleanOnValidationError(true);
    }
    // leaving other settings to default for now

    String[] locations = getFlywayLocations(dataSource);
    if (locations != null) {
      flyway.setLocations(locations);
      flyway.migrate();
    }
    return flyway;
  }

  private String[] getFlywayLocations(DataSource dataSource) {
    String jdbcUrl;
    try (Connection connection = dataSource.getConnection()) {
      jdbcUrl = connection.getMetaData().getURL();
    } catch (SQLException e) {
      LOGGER.warn("Could not get connection metadata. "
                  + "Not migrating database schema.", e);
      return null;
    }
    if (jdbcUrl.startsWith("jdbc:h2:")) {
      return new String[]{"db/migration/h2"};
    }
    if (jdbcUrl.startsWith("jdbc:postgresql:")) {
      return new String[]{"db/migration/postgresql"};
    }
    LOGGER.warn("Did not recognize DB type for JDBC URL {}. "
                + "Not migrating database schema.", jdbcUrl);
    return null;
  }
}
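
Since the class above reads Spring Boot-style property names, a matching properties file could look like the following. All values are just examples (for a local PostgreSQL setup); adjust them to your environment.

spring.datasource.url=jdbc:postgresql://localhost:5432/mydb
spring.datasource.username=batch_user
spring.datasource.password=secret
# spring.datasource.catalog=public
spring.datasource.min-idle=2
spring.datasource.max-active=100
spring.jpa.hibernate.ddl-auto=none
spring.jpa.properties.hibernate.jdbc.fetch_size=200
spring.jpa.properties.hibernate.jdbc.batch_size=100
flyway.clean-on-validation-error=false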

Configure the job

After all the other steps are taken care of, this part is pretty straightforward, so below is just what you would recognise from a simple XML configuration.
Each job consists of steps, and each step contains either a reader, an optional processor and a writer, or just a simple tasklet that calls some code.
Without explaining this in too much detail, we'll just let you read the configuration class below.

@Configuration
public class ExtractJobConfiguration {

  @Autowired
  private StepBuilderFactory stepBuilderFactory;

  @Bean(name = "staxItemReader")
  @StepScope
  public StaxEventItemReader<MyClass> staxItemReader(
        @Value("#{stepExecutionContext['" + FILE_NAME + "']}") String pathToFile,
        Jaxb2Marshaller jaxb2Marshaller) {
    StaxEventItemReader<MyClass> staxEventItemReader = new StaxEventItemReader<>();
    staxEventItemReader.setFragmentRootElementName(
      "{nl:some:wsdls:namespace:1:standard}My_Class");
    staxEventItemReader.setUnmarshaller(jaxb2Marshaller);
    staxEventItemReader.setResource(new FileSystemResource(pathToFile));
    return staxEventItemReader;
  }


  @Bean(name = "processJpaToCsvStep")
  public Step processJpaToCsvStep(
      JpaPagingItemReader<MeteringPointConnectionExtract> jpaPagingItemReader, 
      @Qualifier("flatFileWriter") FlatFileItemWriter<MeteringPointConnectionExtract> 
      flatFileItemWriter) {

    return stepBuilderFactory.get("processJpaToCsvStep")
                             .<MyClass, MyJpaClass>chunk(250)
                             .reader(jpaPagingItemReader)
                             .writer(flatFileItemWriter)
                             .build();
  }

  @Bean(name = "flatFileWriter")
  @StepScope
  public FlatFileItemWriter<MyJpaClass> flatFileItemWriter(
      @Value("#{jobExecutionContext['" + FILE_NAME_CSV + "']}") String pathToCsv) {

    FlatFileItemWriter<MyJpaClass> writer = new FlatFileItemWriter<>();
    FileSystemResource fileSystemResource = new FileSystemResource(pathToCsv);

    if (fileSystemResource.exists()) {
      writer.setResource(fileSystemResource);
      DelimitedLineAggregator<MeteringPointConnectionExtract> delLineAgg = 
        new DelimitedLineAggregator<>();
      delLineAgg.setDelimiter(",");

      List<Field> allFieldsList = FieldUtils.getAllFieldsList(
        MeteringPointConnectionExtract.class);
      List<String> allFieldNames = new ArrayList<>();
      for (Field field : allFieldsList) {
        if (!field.isSynthetic() && !"serialVersionUID".equals(field.getName()) 
            && !List.class.equals(field.getType())
            && !MeteringPointAddressExtract.class.equals(field.getType())) {
          allFieldNames.add(field.getName());
        }
      }
      String[] allFields = new String[allFieldNames.size()];
      allFields = allFieldNames.toArray(allFields);

      BeanWrapperFieldExtractor<MyJpaClass> fieldExtractor = 
        new BeanWrapperFieldExtractor<>();
      fieldExtractor.setNames(allFields);
      delLineAgg.setFieldExtractor(fieldExtractor);

      writer.setLineAggregator(delLineAgg);
      return writer;
    }

    return null;
  }

  @Bean
  @StepScope
  public JpaPagingItemReader<MyJpaClass> jpaPagingItemReader(
      EntityManagerFactory entityManagerFactory,
      @Value("#{jobExecutionContext['" + REPORT_ID + "']}") Long reportId) {
    JpaPagingItemReader<MyJpaClass> reader = new JpaPagingItemReader<>();
    reader.setEntityManagerFactory(entityManagerFactory);
    // use a named parameter; the parameter values map is bound by name
    reader.setQueryString("SELECT m FROM MyObjects m WHERE m.reportId = :reportId");
    reader.setParameterValues(
      Collections.<String, Object>singletonMap("reportId", reportId));
    reader.setPageSize(250);
    return reader;
  }

  @Bean(name = "processExtractFileStep")
  public Step processExtractFileStep(
      MeteringPointConnectionExtractItemProcessor itemProcessor,
      @Qualifier("staxItemReader") ItemReader<MyClass> itemReader,
      @Qualifier("staxItemWriter") ItemWriter<MyJpaClass> 
      meteringPointConnectionExtractItemWriter,
      PlatformTransactionManager platformTransactionManager) {

    return stepBuilderFactory //
             .get("processExtractFileStep") //
             .<MeteringPointExtractEnvelopePCPMP, 
               MeteringPointConnectionExtract>chunk(250) //
             .reader(itemReader) //
             .processor(itemProcessor) //
             .writer(meteringPointConnectionExtractItemWriter) //
             .transactionManager(platformTransactionManager) //
             .build();
  }

  @Bean
  public Jaxb2Marshaller getJaxb2Marshaller() {
    Jaxb2Marshaller jaxb2Marshaller = new Jaxb2Marshaller();
    jaxb2Marshaller.setMappedClass(MyClass.class);
    jaxb2Marshaller.setClassesToBeBound(MyClass.class);
    return jaxb2Marshaller;
  }

  @Bean(name = "partitionMasterStep")
  public Step partitionMasterStep(
      @Qualifier("processExtractFileStep") Step processExtractFileStep, 
      IncomingFilesPartinioner incomingFilesPartinioner) {
    return stepBuilderFactory.get("partitionMasterStep")
                             .partitioner(processExtractFileStep)
                             .partitioner("processExtractFileStep", 
                                          incomingFilesPartinioner)
                             .build();
  }

  @Bean(name = "myComparisonStep")
  public Step myComparisonStep(MyComparisonTasklet myComparisonTasklet, 
                               PlatformTransactionManager jpaTransactionManager) {
    return stepBuilderFactory.get("myComparisonTasklet")
                             .tasklet(myComparisonTasklet)
                             .transactionManager(jpaTransactionManager)
                             .build();
  }

  @Bean(name = "createCsvFileStep")
  public Step createCsvFile(CreateCsvFileTasklet createCsvFileTasklet) {
    return stepBuilderFactory.get("createCsvFileTasklet")
                             .tasklet(createCsvFileTasklet)
                             .build();
  }

  @Bean(name = "mailCsvStep")
  public Step mailCsvStep(MailCsvTasklet mailCsvTasklet) {
    return stepBuilderFactory.get("mailCsvTasklet")
                             .tasklet(mailCsvTasklet)
                             .build();
  }

  @Bean(name = "myComparisonStep")
  public Step myComparisonStep(MyComparisonTasklet myComparisonTasklet, 
      PlatformTransactionManager jpaTransactionManager) {
    return stepBuilderFactory.get("myComparisonStep")
                             .tasklet(myComparisonTasklet)
                             .transactionManager(jpaTransactionManager)
                             .build();
  }

  @Bean(name = "fetchFileStep")
  public Step fetchFileStep(FetchFilesTasklet fetchFileTasklet, 
      PlatformTransactionManager jpaTransactionManager) {
    return stepBuilderFactory.get("fetchFileStep")
                             .tasklet(fetchFileTasklet)
                             .transactionManager(jpaTransactionManager)
                             .build();
  }

  @Bean(name = "staxItemWriter")
  public ItemWriter<MeteringPointConnectionExtract> staxItemWriter(
      EntityManagerFactory entityManagerFactory) {
    JpaItemWriter<MeteringPointConnectionExtract> 
      meteringPointConnectionExtractJpaItemWriter = 
      new JpaItemWriter<>();
    meteringPointConnectionExtractJpaItemWriter.setEntityManagerFactory(
      entityManagerFactory);
    return meteringPointConnectionExtractJpaItemWriter;
  }

  @Bean(name = "signOffAndCleanupFileStep")
  public Step signOffAndCleanupFileStep(SignOffAndCleanupFileTasklet 
      signOffAndCleanupFileTasklet, PlatformTransactionManager 
      jpaTransactionManager) {
    return stepBuilderFactory.get("signOffAndCleanupFileStep")
                             .tasklet(signOffAndCleanupFileTasklet)
                             .transactionManager(jpaTransactionManager)
                             .build();
  }

  @Bean
  public Job extractJob(JobBuilderFactory jobBuilderFactory,
      @Qualifier("fetchFileStep") Step fetchFileStep,
      @Qualifier("partitionMasterStep") Step partitionMasterStep,
      @Qualifier("createCsvFileStep") Step createCsvFileStep,
      @Qualifier("processJpaToCsvStep") Step processJpaToCsvStep,
      @Qualifier("myComparisonStep") Step myComparisonStep,
      @Qualifier("myOtherComparisonStep") Step myOtherComparisonStep,
      @Qualifier("mailCsvStep") Step mailCsvStep,
      @Qualifier("signOffAndCleanupFileStep") Step signOffAndCleanupFileStep) {

    return jobBuilderFactory.get("extractJob")
                            .start(fetchFileStep)
                            .next(partitionMasterStep)
                            .next(createCsvFileStep)
                            .next(processJpaToCsvStep)
                            .next(myComparisonStep)
                            .next(myOtherComparisonStep)
                            .next(mailCsvStep)
                            .next(signOffAndCleanupFileStep)
                            .build();
  }
}
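
The configuration above assumes that IncomingFilesPartitioner creates one partition per incoming file and puts the file's location into the step execution context under the FILE_NAME key, where the staxItemReader picks it up. Its implementation is not part of this listing; a minimal sketch (the /data/incoming directory is purely illustrative) could look like this:

import java.io.File;
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.stereotype.Component;

@Component
public class IncomingFilesPartitioner implements Partitioner {

  // in the real application this constant is shared with the reader configuration
  private static final String FILE_NAME = "fileName";

  @Override
  public Map<String, ExecutionContext> partition(int gridSize) {
    Map<String, ExecutionContext> partitions = new HashMap<>();
    // one partition per incoming file; each partition carries its own
    // execution context with the file to process
    File[] files = new File("/data/incoming").listFiles();
    if (files != null) {
      int i = 0;
      for (File file : files) {
        ExecutionContext context = new ExecutionContext();
        context.putString(FILE_NAME, file.getAbsolutePath());
        partitions.put("partition" + i++, context);
      }
    }
    return partitions;
  }
}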

A small bonus paragraph: configure the Spring Batch Admin

Below there's a small bonus on how to configure the Spring Batch Admin within your application on the path "/batch". This should speak for itself (and otherwise I am available for questions via email). 😉

import org.codehaus.jackson.map.ObjectMapper;
import org.springframework.batch.admin.service.JobService;
import org.springframework.batch.admin.service.SimpleJobServiceFactoryBean;
import org.springframework.batch.admin.web.JobController;
import org.springframework.batch.admin.web.JobExecutionController;
import org.springframework.batch.admin.web.StepExecutionController;
import org.springframework.batch.admin.web.resources.DefaultResourceService;
import org.springframework.batch.core.configuration.ListableJobLocator;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.config.PropertiesFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.ImportResource;
import org.springframework.context.support.ResourceBundleMessageSource;
import org.springframework.core.io.ClassPathResource;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

import javax.sql.DataSource;

@Configuration
@ImportResource(
  {"classpath*:/META-INF/spring/batch/servlet/manager/manager-context.xml",
   "classpath*:/META-INF/spring/batch/servlet/resources/resources-context.xml"})
public class SpringBatchAdminConfig {

  @Autowired
  private JobService jobService;

  @Bean
  public ObjectMapper jacksonMapObjectMapper() {
    return new ObjectMapper();
  }

  @Bean
  public JobController jobController() {
    return new JobController(jobService);
  }

  @Bean
  public StepExecutionController stepExecutionController(
      ObjectMapper jacksonMapObjectMapper) {
    StepExecutionController stepExecutionController = 
      new StepExecutionController(jobService);
    stepExecutionController.setObjectMapper(jacksonMapObjectMapper);
    return stepExecutionController;
  }

  @Bean
  public JobExecutionController jobExecutionController(
      ObjectMapper jacksonMapObjectMapper) {
    JobExecutionController jobExecutionController = 
      new JobExecutionController(jobService);
    jobExecutionController.setObjectMapper(jacksonMapObjectMapper);
    return jobExecutionController;
  }

  @Bean(name = "defaultResources")
  public PropertiesFactoryBean defaultResources() {
    PropertiesFactoryBean bean = new PropertiesFactoryBean();
    bean.setLocation(new ClassPathResource(
      "/org/springframework/batch/admin/web/manager/html-resources.properties"));
    return bean;
  }

  @Bean(name = "jsonResources")
  public PropertiesFactoryBean jsonResources() {
    PropertiesFactoryBean bean = new PropertiesFactoryBean();
    bean.setLocation(new ClassPathResource(
      "/org/springframework/batch/admin/web/manager/json-resources.properties"));
    return bean;
  }

  @Bean(name = "messageSource")
  public ResourceBundleMessageSource messageSource() {
    ResourceBundleMessageSource messageSource = new ResourceBundleMessageSource();
    messageSource.setBasenames(
      "org/springframework/batch/admin/web/manager/html-resources",
      "org/springframework/batch/admin/web/manager/json-resources");
    return messageSource;
  }

  @Bean(name = "resourceService")
  public DefaultResourceService defaultResourceService() {
    DefaultResourceService defaultResourceService = new DefaultResourceService();
    defaultResourceService.setServletPath("/batch/");
    return defaultResourceService;
  }

  @Bean(name = "jobService")
  public SimpleJobServiceFactoryBean jobService(JobRepository jobRepository, 
      JobLauncher jobLauncher, ListableJobLocator jobLocator, DataSource dataSource) {
    SimpleJobServiceFactoryBean factoryBean = new SimpleJobServiceFactoryBean();
    factoryBean.setJobRepository(jobRepository);
    factoryBean.setJobLauncher(jobLauncher);
    factoryBean.setJobLocator(jobLocator);
    factoryBean.setDataSource(dataSource);
    return factoryBean;
  }

  @Controller
  public static class FilesController {
    @RequestMapping(value = "/files/**", method = RequestMethod.GET)
    public String get() {
      return "standard";
    }
  }
}
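
To actually serve the admin screens on "/batch", the controllers above still need a DispatcherServlet mapped to that path. How you register it depends on your deployment; a minimal sketch with a Servlet 3.0 initializer (the class name and context setup are just an example) could look like this:

import javax.servlet.ServletContext;
import javax.servlet.ServletException;
import javax.servlet.ServletRegistration;

import org.springframework.web.WebApplicationInitializer;
import org.springframework.web.context.support.AnnotationConfigWebApplicationContext;
import org.springframework.web.servlet.DispatcherServlet;

public class BatchAdminInitializer implements WebApplicationInitializer {

  @Override
  public void onStartup(ServletContext servletContext) throws ServletException {
    AnnotationConfigWebApplicationContext context =
      new AnnotationConfigWebApplicationContext();
    context.register(SpringBatchAdminConfig.class);

    // serve the Spring Batch Admin controllers and resources under /batch
    ServletRegistration.Dynamic dispatcher =
      servletContext.addServlet("batchAdmin", new DispatcherServlet(context));
    dispatcher.setLoadOnStartup(1);
    dispatcher.addMapping("/batch/*");
  }
}
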
Categories: iPROFS, Java

Back&: The ultimate backend service for Angular applications

06/10/2015

“Backand is a feature-rich backend-as-a-service for Angular that takes care of all the yucky server-side stuff.”

While surfing the Internet in search of new technologies, I found this interesting service for Angular developers. Back& claims that it's the ultimate backend service that you're ever going to need for all your Angular applications. It's understandable if you've never heard about Back& before, since this company is relatively new in the market. Although the company is only 2 years old, its user base is growing very quickly. And it isn't difficult to understand why if you look at all that they are offering for free:

  • 500 connections and 10GB data transfer (free-forever promotion)
  • Social signup, email verification, role-based security, etc.
  • Auto-generated REST API for the database
  • Integration with third-party services (like PayPal or MailChimp)
  • Fully hosted server-side and database in the cloud (Back& automatically provisions the database in Amazon's AWS Relational Database Service)

It looks like a dream come true for all the Angular developers out there, right? So, I decided to give it a shot. Subscribing to the service was a very smooth process. It's the typical registration procedure that you see on many other sites, with email confirmation. Although you can use Facebook or Google+ to register with Back&, I recommend that you use the normal registration form, because you are going to need your login credentials later on. Anyway, once your registration is completed and validated, you'll be presented with a nice dashboard from which you can manage and access all your "Apps". Here you'll find an example app that you can use to create and test your first Angular application.

[Image: the Back& dashboard]

In the “Getting started guide”, you can follow the 5 easy steps to include Back& in your Angular project. And if you don’t want to create an Angular project from scratch, you can follow the “Kickstart seed tutorial”. Following either or both tutorials should integrate Back& into your project in a matter of minutes.

One thing that's worth mentioning is that you can use any existing database with Back&. If you decide to use an existing database, Back& will auto-generate the REST API based on that schema. Otherwise, you can create a new database using Back&'s application model. Back&'s model provides an abstraction layer on top of your database that you can use to create and update the database schema.

[Image: Back&'s abstraction layer]

At first glance, Back& looks quite impressive. I mean, you can create a fully working REST API in a matter of minutes, all the security is taken care of for you, social login is available out of the box using OAuth2 (Facebook, Google+ and GitHub are currently available, with LinkedIn and Twitter coming soon), email verification is already taken care of, and integrating third-party services (like PayPal or MailChimp) is as easy as creating a Google account.

[Image: social login providers]

My experience using Back& was really good. It took me around 10 minutes to create an Angular project and configure it to use Back&. Everything worked as expected. The REST API was created and configured as defined. Social login and email verification worked out of the box. I have no complaints about the whole experience. My only concern is that this company is quite new in the business, so they don't have a long track record offering this kind of service. How good and trustworthy is this company in the long term? Only time will tell, I'm afraid. But they have definitely made a very good start, and if Back& can keep delivering good-quality services, I have no doubt that it will be a great backend solution in the coming years.

Categories: Cloud, Frontend, iPROFS

Get your Hippo content via a REST-API

If you’ve already built a few Hippo sites, you know how to build an HST-2 (Hippo Site Toolkit) website to deliver your content to the visitors of your website. But maybe you want to expose the plain content straight from the repository to other systems.
If that is the case, you might consider building a plain REST API, as it is a perfect way to achieve this objective.

The HST-2 from Hippo does support integration of JAX-RS components out-of-the-box, so in just a few steps this goal can be achieved.

When starting with the Hippo archetype (and some bootstrapped news content in the repository), you can just follow the next steps to create a REST API that will deliver all news documents that are tagged with location = “AMS”. Read more…

Categories: Hippo

How we scrum

For the last one-and-a-half years I have been working on a scrum project, as a member of the development team and later on also in the role of backup scrum master. One day I was talking with one of my iPROFS colleagues about how we have adapted scrum in this project, and from his reaction we both concluded that it might look like an ordinary scrum implementation, but maybe it is not…

Read more…

Categories: Scrum

Using the Spock Framework for unit tests

19/03/2015

The Spock framework is a testing and specification framework built on top of JUnit and Groovy. It has many strengths, but its true power lies in its ability to express tests in a very readable, concise manner, which can dramatically cut the amount of test code. I will show you some examples later.

Another big advantage is its orientation towards data-driven testing. If you have a lot of unit tests with multiple input variables that can take different values, leading to different outputs, then definitely have a look at the “where” clause of the Spock framework.

And… it is powered by Groovy! This gives you a whole lot of power to write your test code expressively, as anyone who has ever used Groovy can attest.
Read more…

Categories: iPROFS

Spring, Hibernate and @Transactional : “aha” moments

10/03/2015

Recently I have been sweating to make annotated transactions work in Spring MVC with Hibernate ORM, and I want to share some “aha” moments that took me time/blood/sweat/tears to figure out. Hopefully you’ll find it useful.

The goal for me was to annotate Service methods with a @Transactional annotation so that Spring will automatically create a transaction around anything Hibernate-related that happens inside the method.
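
As a minimal sketch of that goal (the AccountService and Payment types are hypothetical):

import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class AccountService {

  @Autowired
  private SessionFactory sessionFactory;

  // Spring opens a transaction before this method runs and commits it when
  // the method returns (or rolls back on an unchecked exception)
  @Transactional
  public void savePayment(Payment payment) {
    sessionFactory.getCurrentSession().save(payment);
  }
}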

Read more…

Categories: iPROFS

JSR 371: The last web framework you’ll ever look at

While a lot of companies are still adopting Java EE 7, the fundamentals of the next major Java enterprise edition are already finished. JSR 366 is to develop Java EE 8, the next release of the Java Platform, Enterprise Edition. Java EE 8 will bring us a new action-based web framework. Will this be the last Java web framework you’ll ever look at?

Read more…

Categories: iPROFS, Java

2014, a year in conferences

04/02/2015

2014 has been a busy year for me. So much so that I didn’t get around to writing blogs about all the conferences I visited. So let me give you a short(ish) summary of the whole year instead.

<tl;dr>
Read more…

Off to a great team start with Tuckman, Lencioni and Ofman!

10/01/2015

When you start as Scrum Master on a new team, it is usually just a collection of individuals who together have the skill set necessary to fulfil the goal of the team. In order for that group to become high-performing, you have to let them turn into a team. Yes, you read that right: you cannot turn them into a team, they have to do that themselves. The only thing you can do is facilitate the process they need to go through, so it can speed up a bit. Read more…

Categories: Scrum

Power to the Developers?

30/12/2014

When you start creating your software using the Scrum framework, you notice a shift in power from the Project Manager to the team members. This might be a scary thought. Even more: there is no Project Manager anymore! Read more…

Categories: Scrum