Saturday, December 3, 2016

Static Keyword in Java


The main purpose of the static keyword in Java is memory management. The static keyword can be used in each of the following five cases:

  1.  static variables
  2.  static methods
  3.  static blocks
  4.  static nested classes
  5.  Interface static methods (Java 8 onward)

  • STATIC VARIABLES


In Java, variables can be declared with the static keyword. When a variable is declared static, it is called a class variable. All instances share the same copy of the variable, and a class variable can be accessed directly through the class, without creating an instance.

Example: static int i = 0;

ADVANTAGE OF STATIC VARIABLE

It makes your program memory efficient (i.e. it saves memory).

Let us first look at a program without a static variable. Suppose there are 1500 students in a college. Instance data members get memory each time an object is created. Every student has a unique rollno and name, so instance data members are appropriate for those. But college refers to a property common to all objects; if we don't declare college static, each of the 1500 Student objects gets its own copy of the same value, which wastes memory and is not good programming practice.
    class Student{
         int rollno;
         String name;
         String college="XYZ";
    }
Program using a static variable:

package com.brainatjava.test;

public class Student {

   static String college = "XYZ";
   int rollno;
   String name;

   Student(int r, String n) {
      rollno = r;
      name = n;
   }

   void display() {
      System.out.println(rollno + " " + name + " " + college);
   }

   public static void main(String args[]) {
      Student s1 = new Student(10, "Rabi");
      Student s2 = new Student(20, "Rohit");
      s1.display();
      s2.display();
   }
}
OUTPUT
10 Rabi XYZ
20 Rohit XYZ


  •  STATIC METHOD


A static method can access static variables and other static methods of its class directly, without using an object. To access non-static methods and non-static variables, it must use an object reference. Static methods can be called directly from both static and non-static methods.

EXAMPLE OF STATIC METHOD
public class Test1 {
static int i =10;

 //Static method
 static void display()
 {
    //Its a Static method
    System.out.println("i:"+Test1.i);
 }

 void foo()
 {
     //Static method called in non-static method
     display();
 }
 public static void main(String args[]) //Its a Static Method
 {
     //Static method called in another static method
     display();
     Test1 t1=new Test1();
      t1.foo();
  }
}

OUTPUT
i:10
i:10

STATIC BLOCK


   A static block is used to initialize static data members. It is executed at class-loading time, before the main method runs. A class can have multiple static blocks, which execute in the same order in which they appear in the program.

EXAMPLE  OF SINGLE STATIC BLOCK

public class ExampleOfStaticBlock {
static int i;
  static String str;
  static{
     i =30;
     str = "welcome to BrainAtJava";
  }
  public static void main(String args[])
  {
     System.out.println("Value of i="+i);
     System.out.println("str="+str);
  }
}
OUTPUT
Value of i=30
str=welcome to BrainAtJava

EXAMPLE OF MULTIPLE STATIC BLOCK


public class ExampleOfMultipleStaticBlock {
static int i1;
static int i2;

  static String str1;
  static String str2;

  //First Static block
  static{
     i1 = 70;
     str1 = "Hello";
 }
 //Second static block
 static{
     i2= 55;
     str2 = "java";
 }
 public static void main(String args[])
 {
 System.out.println("Value of i1="+i1);
 System.out.println("Value of str1="+str1);

          System.out.println("Value of i2="+i2);
     System.out.println("Value of str2="+str2);
 
   
  }
}
OUTPUT
Value of i1=70
Value of str1=Hello
Value of i2=55
Value of str2=java

Static Nested Class:

A static nested class in Java is simply a class scoped within another class.
We can think of it as a static member of the enclosing class.
We can access it without creating an instance of the outer class; it is simply referred to as OuterClass.NestedClass, as in the example below.

A static nested class also offers a great advantage for namespace resolution. For example, if we have a class with a common name, in a large project it is quite possible that some other programmer had the same idea and created a class with the same name. We can resolve this name clash by making our class a public static nested class; our class is then written as the outer class name, followed by a period (.), followed by the static nested class name.

Let's take an example

class OuterStatic
{
    private int var = 20;
    private static int staticVar = 50;

    public static void staticMethod() {
        // System.out.println(var); // Error: cannot make a static reference to the non-static field var
        System.out.println(staticVar);
    }

    static class InnerStatic
    {
        public void getFields()
        {
            // System.out.println(var); // Error: cannot make a static reference to the non-static field var
            System.out.println(staticVar);
        }
    }
}

public class StaticClassDemo
{
    public static void main(String[] args)
    {
        OuterStatic.InnerStatic is = new OuterStatic.InnerStatic();
        is.getFields();
    }
}

Interface static methods (Java 8 onward):

 In Java 8 onwards, we can define static methods in an interface, but we can't override them in the implementing classes.

This helps us avoid undesired results in case of a wrong implementation of the interface.

It is good for providing utility methods, for example a precondition check.

It provides safety by not allowing implementing classes to override them.

Let's see code sample below.
public interface MyInf {

 static boolean checkIfNull(String str) {
  System.out.println("Interface Null Check");

  return str == null;
 }
}
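
An interface static method is called on the interface name itself; it is not inherited by implementing classes. A minimal usage sketch (the class MyImpl below is a hypothetical implementation added only for illustration):

public class MyImpl implements MyInf {

 public static void main(String[] args) {
  // called on the interface, not on the implementing class or an instance
  System.out.println(MyInf.checkIfNull(null));   // prints "Interface Null Check" and then true
  System.out.println(MyInf.checkIfNull("test")); // prints "Interface Null Check" and then false

  // MyImpl.checkIfNull("test") would not compile, because interface
  // static methods are not inherited by implementing classes.
 }
}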

Sunday, October 23, 2016

Bloom Filters By Example

In this post we will discuss the Bloom filter and its use cases. Let's first set up a scenario.
Assume there is a cycle stand in our college, and the stand has 1000 slots for parking cycles. Each slot can usually hold 4 cycles, so the stand has capacity for 4000 cycles. It is well known that Mr. Akash keeps his cycle in slot no. 1 every day.

So if we want to know whether Akash is present in college today, we just check slot no. 1, and if any cycle is there we say yes, Akash is present. But that is not one hundred percent correct: since each slot can hold four cycles, the cycle in slot no. 1 may not belong to Akash.

So a false positive can arise. But if no cycle is in slot no. 1, we can say that Akash is definitely absent today. There is no chance of a false negative; we will never say that Akash is absent when he is actually present in college.

A Bloom filter is a simple hash-based filter that works on the same principle. It lets us store elements and quickly identify many (not all) elements that are not present. Sometimes we can definitely say that an element is absent, just as we can say Akash is absent if no cycle is in slot 1.


Use Case:

Suppose we are going to create anti-virus software that maintains a list of malicious sites and a list of known viruses. A naive approach is to maintain a data structure holding the details of all the malicious programs. The problem with this approach is that it may consume a considerable amount of memory: if you know of a million malicious programs and each needs an average of 10 bytes to store, you need 10 megabytes of storage. That is a significant overhead. Is there a more efficient way? Yes, there is.

Implementation Details:


We will do two things with a Bloom filter:
1. Insert an element into the filter.
2. Test whether an element is a member of the filter.

A Bloom filter is a probabilistic data structure, not a deterministic one. We will see the reason in a while.

Let's take a bit array of size m and initialize every position to zero. The idea is that we choose k hash functions whose values fall within the range 0 to m-1, where k is a small constant (much smaller than m).

To add an element to the filter, we simply pass that element to each of the k hash functions. We get k values between 0 and m-1, i.e. k array positions, and we mark these positions as 1. Note that we are not storing the exact hash values in the array; we simply set those positions to 1.

To test for an element (whether it is in the set), feed it to each of the k hash functions to get k array positions. If any of the bits at these positions is 0, the element is definitely not in the set; if it were, all those bits would have been set to 1 when it was inserted. If all the bits are 1, then either the element is in the set, or the bits happened to be set to 1 during the insertion of other elements, which results in a false positive.

Deletion of elements is not allowed, for an obvious reason: to delete an element we would have to set the k bit positions generated by the k hash functions for that element back to 0, but any of those bits may also have been set by other elements, and clearing it would make the filter report those elements as absent. Since false negatives are not allowed in a Bloom filter, we can't delete elements from it.
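
To make the insert and test operations concrete, here is a minimal Bloom filter sketch in Java. It simulates the k hash functions by salting the element with the hash-function index before hashing; a real implementation would use better hash functions and would choose m and k based on the expected number of elements and the acceptable false-positive rate.

import java.util.BitSet;

public class SimpleBloomFilter {

    private final BitSet bits;
    private final int m; // size of the bit array
    private final int k; // number of hash functions

    public SimpleBloomFilter(int m, int k) {
        this.bits = new BitSet(m);
        this.m = m;
        this.k = k;
    }

    // simulate the i-th hash function by salting the value with i
    private int position(String value, int i) {
        return Math.floorMod((value + "#" + i).hashCode(), m);
    }

    // insert: set the k positions produced by the k hash functions to 1
    public void add(String value) {
        for (int i = 0; i < k; i++) {
            bits.set(position(value, i));
        }
    }

    // test: if any of the k positions is 0 the element is definitely not present,
    // otherwise it is probably present (false positives are possible)
    public boolean mightContain(String value) {
        for (int i = 0; i < k; i++) {
            if (!bits.get(position(value, i))) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        SimpleBloomFilter filter = new SimpleBloomFilter(1000, 3);
        filter.add("http://malicious-site.example");
        System.out.println(filter.mightContain("http://malicious-site.example")); // true
        System.out.println(filter.mightContain("http://safe-site.example"));      // false (with high probability)
    }
}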


Applications of Bloom Filter:


1. Cassandra uses Bloom filters to save IO when performing a key lookup: each SSTable has a Bloom filter associated with it, which Cassandra checks before doing any disk seeks.
2. The Google Chrome web browser used a Bloom filter to identify malicious URLs. Any URL was first checked against a local Bloom filter, and only if the Bloom filter returned a positive result was a full check of the URL performed (and the user warned, if that too returned a positive result).


Although the Bloom filter is a data structure, it is called a filter because it is often used as a first pass to filter out elements of a data set that don't match a certain criterion.

Please refer to Wikipedia for more applications and detailed explanations.

Wednesday, October 19, 2016

Least Recently Used (LRU) cache implementation in Java

I was working on a requirement to cache URLs coming continuously from a source. Our cache has a limited size; we can't afford to store all the URLs. So we decided to store the 5 lakh most recently used URLs in the cache. URLs that have not been used for a while (the least recently used ones) are evicted, and new URLs coming from the source are added to the cache. If a new URL is already present in the cache, we mark it as the most recently used.
   
For more clarity on caches, please refer to the blog post on caches. Now let's decide what data structure to use. We can simply think of storing the URLs in a LinkedList of size 5 lakh, but a LinkedList has some limitations.

Limitation 1:

What happens when we want to find the least recently used URL? This is a frequent requirement, because we want to evict the least recently used URL from the cache to make room for more recently used ones.

With a plain list, the complexity of this is of order n, i.e. O(n), which is not desirable.

Limitation 2:

What happens when a URL comes from the source and we want to check whether it already exists in the cache? We have to traverse the whole list.

The complexity of this is also of order n, i.e. O(n), which is not desirable.

From this we conclude that a LinkedList alone is not the right choice.



To solve the first problem we use a doubly linked list, where the least recently used element is at the tail of the list and can be accessed in O(1) time, and the most recently used element is at the head.

To solve the second problem we use a hash map, so that we can check whether a URL is in the cache in O(1) time.

So to create an LRU cache, we combine two data structures: a DoublyLinkedList and a HashMap.

Please see the implementation below. It is straightforward.




package com.brainatjava.lru;

import java.util.HashMap;
import java.util.Map;

public class LRUCache {

    private DoublyLinkedList urlList;
    private Map<String, Node> urlMap;

    public LRUCache(int cacheSize) {
        urlList = new DoublyLinkedList(cacheSize);
        urlMap = new HashMap<String, Node>();
    }

    public void accessURL(String url) {
        Node pageNode = null;
        if (urlMap.containsKey(url)) {
            // If the url is present in the cache, move its node to the head of the list
            pageNode = urlMap.get(url);
            urlList.takeURLToHead(pageNode);
        } else {
            // If the url is not present in the cache, add it
            if (urlList.getCurrSize() == urlList.getSize()) {
                // If the cache is full, remove the tail node from the list
                // and remove its entry from the map.
                urlMap.remove(urlList.getTail().getURL());
            }
            pageNode = urlList.addPageToList(url);
            urlMap.put(url, pageNode);
        }
    }

    public void printCacheState() {
        urlList.printList();
        System.out.println();
    }

    public static void main(String[] args) {
        int cacheSize = 4;
        LRUCache cache = new LRUCache(cacheSize);
        cache.accessURL("http://a");
        cache.printCacheState();
        cache.accessURL("http://b");
        cache.printCacheState();
        cache.accessURL("http://a");
        cache.printCacheState();
        cache.accessURL("http://a");
        cache.printCacheState();
        cache.accessURL("http://d");
        cache.printCacheState();
        cache.accessURL("http://c");
        cache.printCacheState();
        cache.accessURL("http://g");
        cache.printCacheState();
        cache.accessURL("http://h");
        cache.printCacheState();
        cache.accessURL("http://c");
        cache.printCacheState();
    }
}

class DoublyLinkedList {
     
    private final int size;
    private int currSize;
    private Node head;
    private Node tail;

    public DoublyLinkedList(int size) {
        this.size = size;
        currSize = 0;
    }

    public Node getTail() {
        return tail;
    }

    public void printList() {
        if(head == null) {
            return;
        }
        Node tmp = head;
        while(tmp != null) {
            System.out.print(tmp);
            tmp = tmp.getNext();
        }
    }

    public Node addPageToList(String url) {
        Node pageNode = new Node(url);       
        if(head == null) {
            head = pageNode;
            tail = pageNode; 
            currSize = 1;
            return pageNode;
        } else if(currSize < size) {
            currSize++;
        } else {
            tail = tail.getPrev();
            tail.setNext(null);
        }
        pageNode.setNext(head);
        head.setPrev(pageNode);
        head = pageNode;
        return pageNode;
    }

    public void takeURLToHead(Node node) {
        if(node == null || node == head) {
            return;
        }

        if(node == tail) {
            tail = tail.getPrev();
            tail.setNext(null);
        }
         
        Node prev = node.getPrev();
        Node next = node.getNext();
        prev.setNext(next);

        if(next != null) {
            next.setPrev(prev);
        }

        node.setPrev(null);
        node.setNext(head);
        head.setPrev(node);
        head = node;    
    }

    public int getCurrSize() {
        return currSize;
    }

    public void setCurrSize(int currSize) {
        this.currSize = currSize;
    }

    public Node getHead() {
        return head;
    }

    public void setHead(Node head) {
        this.head = head;
    }

    public int getSize() {
        return size;
    }   
}

class Node {
     
    private String url;
    private Node prev;
    private Node next;
     
    public Node(String url) {
        this.url = url;
    }

    public String getURL() {
        return url;
    }

    public void setURL(String url) {
        this.url = url;
    }
     
    public Node getPrev() {
        return prev;
    }

    public void setPrev(Node prev) {
        this.prev = prev;
    }

    public Node getNext() {
        return next;
    }

    public void setNext(Node next) {
        this.next = next;
    }
     
    public String toString() {
        return url + "  ";
    }
}
 

 The approach we applied here uses a DoublyLinkedList and a HashMap: the doubly linked list maintains the access order and lets us find the tail of the cache in O(1) time, and the HashMap lets us check whether a URL already exists in the cache in O(1) time.

But Java has a lesser-known data structure, LinkedHashMap, which provides the features of both a doubly linked list and a hash map.

Remember, though, that by default the LinkedHashMap order is the insertion order, not the access order. There is a constructor, LinkedHashMap(int initialCapacity, float loadFactor, boolean accessOrder), that provides access order, and we should use it for this purpose.
 
  // Straight from the Java doc:
 A special constructor is provided to create a linked hash map whose order of iteration is the order in which its entries were last accessed, from least-recently accessed to most-recently (access-order). This kind of map is well-suited to building LRU caches. Invoking the put or get method results in an access to the corresponding entry (assuming it exists after the invocation completes). The putAll method generates one entry access for each mapping in the specified map, in the order that key-value mappings are provided by the specified map's entry set iterator. No other methods generate entry accesses. In particular, operations on collection-views do not affect the order of iteration of the backing map.
 link http://docs.oracle.com/javase/7/docs/api/java/util/LinkedHashMap.html

Let's see the implementation


package com.brainatjava.lru;

import java.util.LinkedHashMap;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {

    private int size;

    private LRUCache(int size) {
        super(size, 0.75f, true); // accessOrder = true gives us LRU ordering
        this.size = size;
    }

    @Override
    protected boolean removeEldestEntry(java.util.Map.Entry<K, V> eldest) {
        // evict the least recently used entry once the cache grows beyond its size
        return size() > size;
    }

    @Override
    public V get(Object key) {
        return super.get(key);
    }

    public static <K, V> LRUCache<K, V> newInstance(int size) {
        return new LRUCache<K, V>(size);
    }
}
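
A quick usage sketch of this LinkedHashMap-based cache (the URLs and page values below are just placeholders):

LRUCache<String, String> cache = LRUCache.newInstance(3);
cache.put("http://a", "pageA");
cache.put("http://b", "pageB");
cache.put("http://c", "pageC");
cache.get("http://a");           // marks http://a as most recently used
cache.put("http://d", "pageD");  // evicts http://b, the least recently used entry
System.out.println(cache.keySet()); // prints [http://c, http://a, http://d]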

Thursday, October 13, 2016

Caches

First of all, let's understand what a cache is. In plain computer science terms, a cache is a small buffer of pages the OS maintains in order to avoid more expensive main memory accesses.

A CPU cache is usually present on the CPU chip itself, while main memory (RAM) sits on the motherboard and is connected to the CPU.
Because the cache is closer to the CPU, it is much faster than RAM: each read from main memory has to travel to the CPU, while the CPU cache is right there.

Cache is more expensive than primary memory.

Why have another temporary memory when we already have cheap and large main memory?


It is mainly to improve speed. The cache is there to reduce the average memory access time for the CPU.

When the CPU needs some data from memory, the cache is checked first; if the data is available in the cache, it is fetched from there and no memory read is needed.

Caches are faster than main memory; however, they are smaller in size. Therefore, pages are likely to be swapped in and out of the cache. If a page is not found in the cache and main memory is accessed to fetch it, that is a cache miss. The page is then brought into the cache, and the next time it is accessed it is served from the cache.

What if there is no space left in the cache when a cache miss occurs? The new page has to be swapped with one of the pages already in the cache. How do we decide which page goes out of the cache so that the increase in cache misses is minimal? There are many approaches (eviction policies) to decide which page leaves the cache to make space for the new one, such as First In First Out, Least Recently Used, and Least Frequently Used.
What is a least recently used cache?

In the 'First In First Out' approach, the OS selects the page which is oldest in the cache and swaps it with the new page. In the 'Least Recently Used' approach, the OS selects the page which has not been accessed for the longest period of time. In the 'Least Frequently Used' approach, the OS selects the page which has been accessed the least number of times up to a given point in time.

In this post, we will concentrate on the Least Recently Used approach and implement it.

An LRU cache is similar to first in first out (FIFO) storage, where the pages which came in first are evicted first. The difference between FIFO storage and an LRU cache is that when a page is accessed again, it is moved to the top.

If a page entered the cache first, it is the first candidate to go out, provided it is not accessed again before the cache fills up and a cache miss happens.

Here we will describe the implementation of LRU Cache in Java.

Friday, July 1, 2016

Determine the class name of an object

Sometimes it is required at runtime to get the class name of an object. Before we can do any introspection on an object, we need to find its java.lang.Class object. All types in Java (object types, primitive types, array types, etc.) have an associated java.lang.Class object. Please note that I am using an upper-case C for java.lang.Class and a lower-case c for class in general. If we know the name of a class at compile time, we can find the Class (java.lang.Class) object of that class using the syntax below.
 Class myclassObj = MyClass.class;
At runtime, to find the Class (java.lang.Class) object of a given object, the syntax below is used.
 Class myclassObj = myobj.getClass();
Now, to find the String representation of the name of the class, we use the getName() method on the Class object.

String className=myClass.getName();
getName() gives us the fully qualified class name, i.e. the class name along with the package name.

If we want only the class name, without the package name prefix, we can use the getSimpleName() method on the Class object.
Let's see a working example.

package com.brainatjava.test;

import java.util.HashMap;
import java.util.Map;


public class ClassTest {
    public static void main(String[] args) {

        String s ="good";
        System.out.println("class name is: " + s.getClass().getSimpleName());

        Map<String, String> m = new HashMap<>();
        System.out.println("class name is: " + m.getClass().getName());        

        Boolean b = new Boolean(true);
        System.out.println("class name is: " + b.getClass().getName());

        StringBuilder sb = new StringBuilder();
        Class c = sb.getClass();
        System.out.println("class name is: " + c.getName());

        int[] a=new int[3];
        System.out.println("class name is: " + a.getClass().getName());

        Integer[] in=new Integer[3];
        System.out.println("class name is: " + in.getClass().getName());

        double[] du=new double[3];
        System.out.println("class name is: " + du.getClass().getName());

        Double[] d=new Double[3];
        System.out.println("class name is: " + d.getClass().getName());
       
    }
}

And we get the below output

class name is: String

class name is: java.util.HashMap

class name is: java.lang.Boolean

class name is: java.lang.StringBuilder

class name is: [I

class name is: [Ljava.lang.Integer;

class name is: [D

class name is: [Ljava.lang.Double;
If you are curious about the last four lines of the output, please go through the explanation below.

What are [I, [D, [Ljava.lang.Integer; and [Ljava.lang.Double; ?
As we saw in the paragraph above, we can get the java.lang.Class object by calling the getClass() method on an object.

 If this class object represents a reference type that is not an array type then the binary name of the class is returned, as specified by The Java™ Language Specification.

If this class object represents a primitive type or void, then the name returned is a String equal to the Java language keyword corresponding to the primitive type or void.

If this class object represents a class of arrays, then the internal form of the name consists of the name of the element type preceded by one or more '[' characters representing the depth of the array nesting. The encoding of element type names is as follows:

    Element Type            Encoding
    boolean                 Z
    byte                    B
    char                    C
    class or interface      Lclassname;
    double                  D
    float                   F
    int                     I
    long                    J
    short                   S
For more details please refer to the official Oracle documentation for the method getName().

Monday, June 27, 2016

operations on Java Streams -- continued

Filter Operation:

We can apply the filter operation to an input stream to produce another, filtered stream. Suppose we have a finite stream of natural numbers and we want to keep only the even numbers; we can apply the filter operation here. Please note that, unlike the map operation, the elements in the filtered stream are of the same type as the elements in the input stream.

The filter operation takes the functional interface Predicate as its argument. Since the Predicate interface has a public method test which returns a boolean value, we can pass as the argument to filter a lambda expression that evaluates to a boolean value.


The size of the output stream is less than or equal to the size of the input stream. Please refer to the example below.


Stream.of(1, 2, 3, 4, 5,6).filter(n->n%2==0).forEach(System.out::println);

Reduce Operation:

This combines all elements of a stream to generate a single result by applying a combining function repeatedly. Computing the sum, maximum, average, count, etc. are examples of the reduce operation.

The reduce operation takes two parameters: an initial value and an accumulator. The accumulator is the combining function. If the stream is empty, the initial value is the result.

The initial value and an element are passed to the accumulator, which returns a partial result. The partial result and the next element are then passed to the accumulator again, and this repeats until all elements in the stream have been consumed. The last value returned from the accumulator is the result of the reduce operation.

The Stream interface contains a reduce() method to perform the reduce operation. The method has three overloaded versions.

1. Let's take an example of the first one:

T reduce(T identity, BinaryOperator<T> accumulator)


List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5);
int sum = numbers.stream()
                 .reduce(0, Integer::sum);
System.out.println(sum);

Notice here that 0 is the initial value and Integer::sum is the accumulator, i.e. the combining function.

2. Let's take an example of the second one:

<U> U reduce(U identity, BiFunction<U, ? super T, U> accumulator, BinaryOperator<U> combiner)

Note that the second argument, the accumulator, takes an argument whose type may be different from the type of the stream; it is used for accumulating the partial results. The third argument is used for combining the partial results when the reduce operation is performed in parallel, in which case the results of the different threads are combined. If we are not running the reduction in parallel, the combiner is not used.


// assuming employees is a List<Employee> and getSalary() returns an int
int result = employees.stream()
        .reduce(0, (intermediateSum, employee) -> intermediateSum + employee.getSalary(), Integer::sum);
System.out.println(result);


The above code shows how to calculate the sum of the salaries of all the employees using the reduce operation. Here 0 is the initial value, the second argument is the accumulator, and Integer::sum is the combiner.

3. Let's take an example of the third one:

Optional<T> reduce(BinaryOperator<T> accumulator)

Sometimes we cannot specify an initial value for a reduce operation. Assume we get a list of numbers on the fly: we have no idea whether the list is empty or has some elements, and we want to get the maximum integer value from that list of numbers. If the underlying stream is empty, we cannot initialize the maximum value, so in such a case the result is not defined. This version of the reduce method therefore returns an Optional object that contains the result. If the stream contains only one element, that element is the result.

The following snippet of code computes the maximum of integers in a stream:


Optional<Integer> maxValue = Stream.of(1, 2, 3, 4, 5)
.reduce(Integer::max);
if (maxValue.isPresent()) {
System.out.println("max = " + maxValue.get());
}
else {
System.out.println("max is not available.");
}

Collect Operation:

We saw that in the case of the reduce operation we get a single value as the result. But sometimes we want to collect a set of values as the result of the stream pipeline operations.

Let's take an example. We have a map of users with the user name as key and the user's account number as value. This map contains all users, both active and inactive. We also have a set of names that contains only the active users. Our requirement is to get all the active account numbers, so the result here will itself be a list of active account numbers.


Set<String> activeUserList = new HashSet<>();
Map<String, String> completeUserMap = new HashMap<>();

List<String> activeAccountNumbers = completeUserMap.entrySet().stream()
        .filter(e -> activeUserList.contains(e.getKey()))
        .map(Map.Entry::getValue)
        .collect(Collectors.toList());


Notice that first we filter completeUserMap against activeUserList, then use the map operation to get the user's account number from each map entry, and then collect the result into a List.

Now let's collect the same result into a map, i.e. we will collect a map of active users and their account numbers.


Map<String, String> activeUserAccounts = completeUserMap.entrySet().stream()
        .filter(e -> activeUserList.contains(e.getKey()))
        .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));


We will learn about Collectors, parallel streams, and operation reordering in detail in the next part of this series.

Wednesday, June 15, 2016

Usage of Java streams

In part 1 of this series we saw the basics of Java streams. Now we discuss how to use streams, along with some important operations on them. Let's see the different ways to create streams.

Create streams from existing values:

There are two overloads of the of() method in the Stream interface to create a stream from a single value or from multiple values.

Stream<String> stream = Stream.of("test");
Stream<String> multiStream = Stream.of("test1", "test2", "test3", "test4");

Create empty stream :


Stream stream = Stream.empty();

Create Stream from function:

We can generate an infinite stream from a function that can produce an infinite number of elements if required. There are two static methods, iterate and generate, in the Stream interface to produce infinite streams.

 static <T> Stream<T> iterate(T seed, UnaryOperator<T> f)
 static <T> Stream<T> generate(Supplier<T> s)
The iterate() method takes two arguments: a seed and a function. The seed becomes the first element of the stream. The second element is generated by applying the function to the first element, the third element by applying the function to the second element, and so on.

The below example creates an infinite stream of natural numbers starting with 1.


Stream<Integer> naturalNumbers = Stream.iterate(1, n -> n + 1);
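
Since the stream above is infinite, we normally bound it with an intermediate operation such as limit before applying a terminal operation; for example, to print the first five natural numbers:

naturalNumbers.limit(5).forEach(System.out::println);
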
The generate(Supplier<T> s) method uses the specified Supplier to generate an infinite sequential unordered stream. Here Supplier is a functional interface, so we can use lambda expressions. Let's see the example below, which generates an infinite stream of random numbers. Here we use a method reference to generate the random numbers; please follow the series on method references (the double colon operator) if you are not familiar with it.

Stream.generate(Math::random).limit(5).forEach(System.out::println);

Create Stream from Collections:

Collections are the data sources we most commonly use for creating streams. The Collection interface contains the stream() and parallelStream() methods, which create sequential and parallel streams from a collection.

Example

Set<String> nameSet = new HashSet<>();

// add some elements to the set
nameSet.add("name1");
nameSet.add("name2");

// create a sequential stream from the nameSet
Stream<String> sequentialStream = nameSet.stream();

// create a parallel stream from the nameSet
Stream<String> parallelStream = nameSet.parallelStream();

Create Streams from Files:

Many methods were added to classes in the java.io and java.nio.file packages in Java 8 to facilitate IO operations using streams. Let's see an example that reads the content of a file using a stream.

Path path = Paths.get(filePath);
Stream<String> lines = Files.lines(path);
lines.forEach(System.out::println);
The lines() method was added to the Files class in Java 1.8; it reads all the lines of a file as a Stream.
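
The stream returned by Files.lines holds an open file handle, so it should be closed when we are done. A small sketch using try-with-resources (filePath is assumed to point to an existing text file):

try (Stream<String> lineStream = Files.lines(Paths.get(filePath))) {
    lineStream.forEach(System.out::println); // the file is closed automatically when the stream is closed
} catch (IOException e) {
    e.printStackTrace();
}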

Stream Operations:

Now we will go through some commonly used stream operations and their usage.
  1. distinct
  2. filter
  3. flatMap
  4. limit
  5. map
  6. skip
  7. peek
  8. sorted
  9. allMatch
  10. anyMatch
  11. findAny
  12. findFirst
  13. noneMatch
  14. forEach
  15. reduce
Operations 1 to 8 are intermediate operations and 9 to 15 are terminal operations. As some of the operations are self-explanatory, we will discuss only those which are not trivial.

 

Map Operation:

                                                              

A map operation applies a function to each element of the input stream to produce another stream (the output stream). The number of elements in the input and output streams is the same, so this is a one-to-one mapping: the operation takes an element e1 and applies the function f to it to get f(e1), and so on. However, the type of the elements in the output stream may be different from the type of the elements in the input stream. Let's take an example.


Suppose we have 1000 keys with values in a Redis data store, and we want to fetch the values of all those keys and then perform some operation on them. We want to do it with Future objects, so how do we do it in parallel with a Java Stream? We will use a thread pool service to fetch the data from Redis. Suppose our uniqueItemIds list contains the list of keys.

// assuming redisTemplate is a configured Spring Data Redis template
// and uniqueItemIds is a List<Long> of the keys to fetch
HashOperations redisHash = redisTemplate.opsForHash();
ExecutorService threadPoolService = Executors.newFixedThreadPool(10);

uniqueItemIds.stream()
        .parallel()
        .map(itemId -> threadPoolService.submit(() -> redisHash.get("items", itemId))) // "items" is a placeholder hash name
        .forEach(future -> {
            try {
                System.out.println(future.get());
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
Here the body of the callable fetches the data from Redis for the given item id. Since submit returns a Future, the map operation takes an itemId (of type Long) and returns an object of type Future. I am emphasizing the point that "the type of elements in the output stream returned by the map operation may be different from the type of elements in the input stream".

flatMap Operation:     

Unlike the map operation, the Streams API supports one-to-many mapping through flatMap. The mapping function takes an element from the input stream and maps it to a stream; the type of the input element and the type of the elements in the mapped stream may be different. With map, this step produces a stream of streams: if the input stream is a Stream<T>, the mapped stream will be a Stream<Stream<R>>, which is not what we want. Assume we have a map with the structure below.

Map<Integer, Map<Long, List<String>>> itemsMap = new ConcurrentHashMap<>();

//Now let's fill the map with some values.

itemsMap.put(2, new ConcurrentHashMap<>());
itemsMap.put(3, new ConcurrentHashMap<>());
itemsMap.get(2).put(1L, Arrays.asList("abc","cde","def","rty"));
itemsMap.get(2).put(2L, Arrays.asList("2abc","2cde","2def","2rty"));
itemsMap.get(2).put(3L, Arrays.asList("3abc","3cde","3def","3rty"));
itemsMap.get(3).put(1L, Arrays.asList("abc3","cde3","def3","rty3"));

Now our aim is to get all the lists of strings in one stream. How can we achieve it?

An immediate solution that comes to mind is to write something like the following.

itemsMap.values().stream().parallel().map(m->m.values().stream()).forEach(System.out::println);

Now we get output like the following (the exact object references will differ):

java.util.stream.ReferencePipeline$Head@4eec7777
java.util.stream.ReferencePipeline$Head@3b07d329

We expected to see the lists of strings in the output, but we don't. This is because the map operation produces a stream for each inner map, so what we hand to forEach is a Stream<Stream<List<String>>>, and forEach simply prints each inner stream object.

Our next attempt looks like this:


itemsMap.values()
        .stream()
        .parallel()
        .map(m -> m.values().stream())
        .forEach(e -> e.forEach(System.out::println));
And the output is:

[abc, cde, def, rty]
[2abc, 2cde, 2def, 2rty]
[3abc, 3cde, 3def, 3rty]
[abc3, cde3, def3, rty3]

We manage to see all our lists together, but observe that we are still working with a Stream<Stream<List<String>>>; we have just written the printing differently, with a nested forEach.

The correct approach to our problem is


itemsMap.values()
        .stream()
        .parallel()
        .flatMap(m -> m.values().stream())
        .forEach(System.out::println);
So here flatMap comes to the rescue: it flattens the Stream<Stream<List<String>>> into a Stream<List<String>>. In general, make sure to use flatMap whenever you end up with a Stream<Stream<T>>.

We will discuss some other important operations in part 3 of this series.