
JDK 15 Sealed Classes – how to use across packages?

I have a simple sealed class, MyShape:

public sealed class MyShape permits MyCircle {

    private final int width;
    private final int height;

    public MyShape(int width, int height) {
        this.width = width;
        this.height = height;
    }

    public int width() {
        return width;
    }

    public int height() {
        return height;
    }
}

And one simple subclass, MyCircle:

public final class MyCircle extends MyShape {

    public MyCircle(int width) {
        super(width, width);
    }
}

Everything compiles and works when both classes are in the same package. If I move MyCircle into a sub-package, the build breaks with:

java: class is not allowed to extend sealed class: org.example.MyShape

My understanding from the JDK 15 docs is that this should work. Am I missing a step?

I’ve created a GitHub repo if you want to experiment.

Answer

As stated in the JDK 15 documentation that you linked:

They must be in the same module as the sealed class (if the sealed class is in a named module) or in the same package (if the sealed class is in the unnamed module).

If your project has no module-info.java, both classes live in the unnamed module, so every permitted subclass must be in the same package as the sealed class. A sub-package is a different package as far as the language is concerned, which is why the build breaks.
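
To use a permitted subclass from a different package, declare a named module containing both packages. A minimal sketch, assuming MyCircle was moved to a hypothetical sub-package org.example.shapes and the module is named org.example (both names are illustrative):

// module-info.java, in the source root
module org.example {
    exports org.example;
    exports org.example.shapes;
}

With the named module in place, the sealed class can reference the subclass across packages, fully qualified or via an import:

public sealed class MyShape permits org.example.shapes.MyCircle {
    // body unchanged
}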


How to download a ReadableStream in the browser that has been returned from fetch

I am receiving a ReadableStream from a server, returned from my fetch call.

A ReadableStream is returned, but I don't know how to trigger a download from this point. I can't use the URL in an href because the request requires an Authorization header.

I don't want to install fs on the client, so what options do I have?

  try {
    const res = await fetch(url, {
      method: 'GET',
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/octet-stream'
      }
    });

    const blob = await res.blob();

    const newBlob = new Blob([blob]);
    const newUrl = window.URL.createObjectURL(newBlob);

    const link = document.createElement('a');
    link.href = newUrl;
    link.setAttribute('download', 'filename');
    document.body.appendChild(link);
    link.click();
    link.parentNode.removeChild(link);

    window.URL.revokeObjectURL(newUrl); // revoke the object URL, not the Blob
  } catch (error) {
    console.log(error);
  }

Update 1

I converted the file to a Blob, then passed it into a newly generated href, and successfully downloaded a file. The end result, however, was the ReadableStream contents dumped into a .txt file.

Meaning the file contained stuff like this:

x:ÚêÒÓ%¶âÜTb∞܃

Answer

I found two solutions. Both worked, but I was missing one simple addition to make them work.

The native solution:

  try {
    const res = await fetch(url, {
      method: 'GET',
      headers: {
        Authorization: `Bearer ${token}`
      }
    });

    const blob = await res.blob();
    const newBlob = new Blob([blob]);

    const blobUrl = window.URL.createObjectURL(newBlob);

    const link = document.createElement('a');
    link.href = blobUrl;
    link.setAttribute('download', `${filename}.${extension}`);
    document.body.appendChild(link);
    link.click();
    link.parentNode.removeChild(link);

    // clean up the object URL
    window.URL.revokeObjectURL(blobUrl);
  } catch (error) {
    console.log(error);
  }

This version uses the npm package streamSaver, for anyone who would prefer it:

  try {
    const res = await fetch(url, {
      method: 'GET',
      headers: {
        Authorization: `Bearer ${token}`
      }
    });

    const fileStream = streamSaver.createWriteStream(`${filename}.${extension}`);
    const writer = fileStream.getWriter();

    const reader = res.body.getReader();

    const pump = () => reader.read()
      .then(({ value, done }) => {
        if (done) writer.close();
        else {
          writer.write(value);
          return writer.ready.then(pump);
        }
      });

    await pump()
      .then(() => console.log('Closed the stream, done writing'))
      .catch(err => console.log(err));
  } catch (error) {
    console.log(error);
  }

The key reason it was not working was that I did not include the file extension in the download name. Without it, the download either errored out because the MIME type was wrong, or it opened as a .txt file containing a string of the body instead of the image.


Spring Data MongoDB: calling save twice leads to DuplicateKeyException

I am trying to save an entity with a Spring Data MongoDB repository. I have an EventListener that cascades saves.

The problem is that I need to save the entity once to obtain its internal id, then perform further state mutations and save it again afterwards.

@Test
void testUpdate() {
    FooDto fooDto = getResource("/json/foo.json", new TypeReference<FooDto>() {
    });
    Foo foo = fooMapper.fromDTO(fooDto);
    foo = fooService.save(foo);
    log.info("Saved foo: " + foo);
    foo.setState(FooState.Bar);
    foo = fooService.save(foo);
    log.info("Updated foo: " + foo);
}

I have a unique index on a child collection of Foo. The second save does not update the children but tries to insert them again, which leads to org.springframework.dao.DuplicateKeyException.

Why does it not update the children but instead tries to insert them again?

Related:

Spring Data MongoRepository save causing Duplicate Key error


Edit: versions: MongoDB 4, Spring Boot 2.3.3.RELEASE


Edit: more details:

Repository:

public interface FooRepository extends MongoRepository<Foo, String> {
}

Entity:

@Document
public class Foo {

    @Id
    private String id;

    private FooState state;

    @DBRef
    @Cascade
    private Collection<Bar> bars = new ArrayList<>();

    // ...
}

CascadeMongoEventListener:

//from https://mflash.dev/blog/2019/07/08/persisting-documents-with-mongorepository/#unit-tests-for-the-accountrepository
public class CascadeMongoEventListener extends AbstractMongoEventListener<Object> {

    private @Autowired MongoOperations mongoOperations;

    public @Override void onBeforeConvert(final BeforeConvertEvent<Object> event) {
        final Object source = event.getSource();
        ReflectionUtils
                .doWithFields(source.getClass(), new CascadeSaveCallback(source, mongoOperations));
    }


    private static class CascadeSaveCallback implements ReflectionUtils.FieldCallback {

        private final Object source;
        private final MongoOperations mongoOperations;

        public CascadeSaveCallback(Object source, MongoOperations mongoOperations) {
            this.source = source;
            this.mongoOperations = mongoOperations;
        }

        public @Override void doWith(final Field field)
                throws IllegalArgumentException, IllegalAccessException {
            ReflectionUtils.makeAccessible(field);

            if (field.isAnnotationPresent(DBRef.class) && field.isAnnotationPresent(Cascade.class)) {
                final Object fieldValue = field.get(source);

                if (Objects.nonNull(fieldValue)) {
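                    // Note: as posted, this IdentifierCallback is applied below (in the
                    // non-collection branch), but its isIdFound() result is never consulted.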
                    final var callback = new IdentifierCallback();
                    final CascadeType cascadeType = field.getAnnotation(Cascade.class).value();

                    if (cascadeType.equals(CascadeType.PERSIST) || cascadeType.equals(CascadeType.ALL)) {
                        if (fieldValue instanceof Collection<?>) {
                            ((Collection<?>) fieldValue).forEach(mongoOperations::save);
                        } else {
                            ReflectionUtils.doWithFields(fieldValue.getClass(), callback);
                            mongoOperations.save(fieldValue);
                        }
                    }
                }
            }
        }
    }


    private static class IdentifierCallback implements ReflectionUtils.FieldCallback {

        private boolean idFound;

        public @Override void doWith(final Field field) throws IllegalArgumentException {
            ReflectionUtils.makeAccessible(field);

            if (field.isAnnotationPresent(Id.class)) {
                idFound = true;
            }
        }

        public boolean isIdFound() {
            return idFound;
        }
    }
}
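
For completeness: for the cascade to run at all, the listener must be registered as a Spring bean; Spring Data invokes any AbstractMongoEventListener bean on lifecycle events. A minimal sketch (the configuration class name is illustrative):

@Configuration
public class MongoListenerConfig {

    @Bean
    public CascadeMongoEventListener cascadeMongoEventListener() {
        return new CascadeMongoEventListener();
    }
}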

Edit: expected behaviour

From the docs in org.springframework.data.mongodb.core.MongoOperations#save(T):

Save the object to the collection for the entity type of the object to save. This will perform an insert if the object is not already present, that is an ‘upsert’.
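
That behaviour hinges entirely on the id. A minimal sketch of the decision, reusing the Foo entity from this question (illustrative, not the actual test):

    Foo foo = new Foo();            // id is null
    mongoOperations.save(foo);      // id == null -> insert; Mongo generates an _id
    foo.setState(FooState.Bar);
    mongoOperations.save(foo);      // id != null -> update of the existing document

    Foo other = new Foo();          // a fresh object whose id is still null
    mongoOperations.save(other);    // treated as a new document and inserted again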


Edit – new insights:

It might be related to the index on the Bar child collection (@DBRef and @Cascade lead to mongoOperations::save being called from the EventListener).

I created another similar test with another entity and it worked.

The index on the child Bar entity (which is held as a collection in the parent Foo entity):

@CompoundIndex(unique = true, name = "fooId_name", def = "{'fooId': 1, 'name': 1}")
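
Given that unique index, a cascaded save of a Bar whose id is null is treated as a fresh insert rather than an update, and the second insert carries the same (fooId, name) pair. A minimal sketch of the failure mode (the setters are assumed from the field names in the index definition):

    Bar bar = new Bar();
    bar.setFooId(foo.getId());   // assumed setter for the indexed fooId field
    bar.setName("bar-1");        // assumed setter for the indexed name field
    mongoOperations.save(bar);   // insert #1
    // if bar's id is still null on the next cascaded save...
    mongoOperations.save(bar);   // insert #2: same (fooId, name) -> DuplicateKeyException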

Update: I think I found the problem. Since I am using custom serialization/deserialization in my Converter (Document.parse()), the id field is not mapped properly. This results in the id being null, which leads to an insert instead of an update.

I will write an answer once I have resolved this properly.

public class MongoResultConversion {

    @Component
    @ReadingConverter
    public static class ToResultConverter implements Converter<Document, Bar> {

        private final ObjectMapper mapper;

        @Autowired
        public ToResultConverter(ObjectMapper mapper) {
            this.mapper = mapper;
        }

        public Bar convert(Document source) {
            String json = toJson(source);
            try {
                return mapper.readValue(json, new TypeReference<Bar>() {
                });
            } catch (JsonProcessingException e) {
                throw new RuntimeException(e);
            }
        }


        protected String toJson(Document source) {
            return source.toJson();
        }

    }



    @Component
    @WritingConverter
    public static class ToDocumentConverter implements Converter<Bar, Document> {

        private final ObjectMapper mapper;

        @Autowired
        public ToDocumentConverter(ObjectMapper mapper) {
            this.mapper = mapper;
        }

        public Document convert(Bar source) {
            String json = toJson(source);
            return Document.parse(json);
        }

        protected String toJson(Bar source) {
            try {
                return mapper.writeValueAsString(source);
            } catch (JsonProcessingException e) {
                throw new RuntimeException(e);
            }
        }
    }



}

Answer

As stated in my last edit, the problem was with the custom serialization/deserialization and the Mongo document conversion. This resulted in the id being null, so an insert was performed instead of an update.

The following code is my implementation of the custom converters, which now map the ObjectId explicitly:

public class MongoBarConversion {
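
    // ID_FIELD (used below) is assumed to be a constant holding the entity's JSON id
    // property name, e.g. "id"; its definition is not shown in the original post.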

    @Component
    @ReadingConverter
    public static class ToBarConverter implements Converter<Document, Bar> {

        private final ObjectMapper mapper;

        @Autowired
        public ToBarConverter(ObjectMapper mapper) {
            this.mapper = mapper;
        }

        public Bar convert(Document source) {
            JsonNode json = toJson(source);
            setObjectId(source, json);
            return mapper.convertValue(json, new TypeReference<Bar>() {
            });
        }

        protected void setObjectId(Document source, JsonNode jsonNode) {
            ObjectNode modifiableObject = (ObjectNode) jsonNode;
            String objectId = getObjectId(source);
            modifiableObject.put(ID_FIELD, objectId);
        }

        protected String getObjectId(Document source) {
            String objectIdLiteral = null;
            ObjectId objectId = source.getObjectId("_id");
            if (objectId != null) {
                objectIdLiteral = objectId.toString();
            }
            return objectIdLiteral;
        }


        protected JsonNode toJson(Document source) {
            JsonNode node = null;
            try {
                String json = source.toJson();
                node = mapper.readValue(json, JsonNode.class);
            } catch (JsonProcessingException e) {
                throw new RuntimeException(e);
            }
            return node;
        }

    }


    @Component
    @WritingConverter
    public static class ToDocumentConverter implements Converter<Bar, Document> {

        private final ObjectMapper mapper;

        @Autowired
        public ToDocumentConverter(ObjectMapper mapper) {
            this.mapper = mapper;
        }

        public Document convert(Bar source) {
            try {
                JsonNode jsonNode = toJson(source);
                setObjectId(source, jsonNode);
                String json = mapper.writeValueAsString(jsonNode);
                return Document.parse(json);
            } catch (JsonProcessingException e) {
                throw new RuntimeException(e);
            }
        }

        protected void setObjectId(Bar source, JsonNode jsonNode) throws JsonProcessingException {
            ObjectNode modifiableObject = (ObjectNode) jsonNode;
            JsonNode objectIdJson = getObjectId(source);
            modifiableObject.set("_id", objectIdJson);
            modifiableObject.remove(ID_FIELD);
        }

        protected JsonNode getObjectId(Bar source) throws JsonProcessingException {
            ObjectNode _id = null;
            String id = source.getId();
            if (id != null) {
                _id = JsonNodeFactory.instance.objectNode();
                _id.put("$oid", id);
            }
            return _id;
        }

        protected JsonNode toJson(Bar source) {
            return mapper.convertValue(source, JsonNode.class);
        }
    }


}
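
For these converters to take effect they have to be registered with Spring Data. A minimal sketch of the usual registration (assuming Spring Boot auto-configuration, which picks up a MongoCustomConversions bean; the config class name is illustrative):

import java.util.List;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;

@Configuration
public class MongoConversionConfig {

    @Bean
    public MongoCustomConversions mongoCustomConversions(
            MongoBarConversion.ToBarConverter reader,
            MongoBarConversion.ToDocumentConverter writer) {
        // Register both converters so Spring Data uses them for Bar <-> Document mapping
        return new MongoCustomConversions(List.of(reader, writer));
    }
}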

So, to conclude: two subsequent saves will indeed lead to an upsert, as long as the id is non-null. The bug was in my code.
