lundi 6 novembre 2023

How do we set a dictionary field using reflection if we don't know the type arguments at compile time?

We have a dictionary field in some unknown class, Dictionary<TKey, TValue> MyDictionary, and we have the FieldInfo object obtained via reflection. We also have the type arguments Type keyType = typeof(TKey) and Type valueType = typeof(TValue), which are obtained at runtime.

The key-value pairs are present in two separate lists List<object> keys and List<object> values.

How do we generate the dictionary and set its value using FieldInfo.SetValue()?

So far, we tried this:

var dictionary = Activator.CreateInstance(fieldInfo.FieldType);
var keys = new List<object>();
var values = new List<object>();
// Code to set keys and values
for (int i = 0; i < keys.Count; i++)
   (dictionary as IDictionary).Add(keys[i], values[i]);

fieldInfo.SetValue(mainObj, dictionary);

In the specific case where TValue is List<int>, the Add call throws an exception:

System.ArgumentException: 'The value "System.Collections.Generic.List`1[System.Object]" is not of type "System.Collections.Generic.List`1[System.Int32]" and cannot be used in this generic collection.'
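The exception suggests the values were materialized as List<object> rather than as instances of valueType. One possible fix, sketched below under the assumption that valueType is a closed List<T> and each element is convertible to T with Convert.ChangeType, is to rebuild each value as a properly typed list before adding it:

```csharp
// Sketch: rebuild each List<object> value as a list of the runtime element
// type before adding it to the dictionary. Assumes valueType is List<T> and
// every element is convertible to T via Convert.ChangeType.
var dictionary = (IDictionary)Activator.CreateInstance(fieldInfo.FieldType);

for (int i = 0; i < keys.Count; i++)
{
    object value = values[i];
    if (valueType.IsGenericType && valueType.GetGenericTypeDefinition() == typeof(List<>))
    {
        Type elementType = valueType.GetGenericArguments()[0];
        var typedList = (IList)Activator.CreateInstance(valueType);
        foreach (object element in (IEnumerable)value)
            typedList.Add(Convert.ChangeType(element, elementType));
        value = typedList;
    }
    dictionary.Add(Convert.ChangeType(keys[i], keyType), value);
}

fieldInfo.SetValue(mainObj, dictionary);
```

Convert.ChangeType only covers IConvertible conversions (primitives, strings, etc.); for arbitrary value types a custom conversion step would be needed.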





Create Scala case classes for SCIO type safe read/write dynamically

I'm trying to generate Scala classes dynamically, which should then be used as safe types in SCIO reads/writes to/from GCP BigQuery.

Target example:

import com.spotify.scio.bigquery._
import com.spotify.scio.bigquery.types.BigQueryType

@BigQueryType.fromTable("dataset.SOURCE_TABLE")
class SOURCE_TABLE

@BigQueryType.toTable
case class TARGET_TABLE(id: String, name: String, desc: String)

def main(cmdlineArgs: Array[String]): Unit = {
  val (sc, args) = ContextAndArgs(cmdlineArgs)
  sc.typedBigQuery[SOURCE_TABLE]()  // Read from BQ
    .map( row => transformation(row) ) // Transform -> SCollection[TARGET_TABLE]
    .saveAsTypedBigQueryTable(Table.Spec(args("TARGET_TABLE")))  // save to BQ
  sc.run()
  ()
}

As input there are the dataset, SOURCE_TABLE, TARGET_TABLE, and a list of target fields, so I can build up a string source for the generated classes. All these values are retrieved dynamically from third-party sources (JSON, XML, etc.) and can change on every execution.

So, the source of generated classes can be presented as:

val sourceString =
  s"""
     |import com.spotify.scio.bigquery.types.BigQueryType
     |
     |@BigQueryType.fromTable("$dataset.$SOURCE_TABLE")
     |class $SOURCE_TABLE
     |
   """.stripMargin

val targetString =
  s"""
     |import com.spotify.scio.bigquery.types.BigQueryType
     |
     |@BigQueryType.toTable
     |case class $TARGET_TABLE($fieldDefinitions)
   """.stripMargin

These sources are meant to be compiled into classes whose types are required for SCIO BigQuery I/O.

Scala version: 2.12.17

I tried to use the Scala runtime Mirror and Toolbox (from this answer, this one, etc.). But all variants throw the same error: "enable macro paradise (2.12) or -Ymacro-annotations (2.13) to expand macro annotations". It's obvious that the Toolbox's internal compiler doesn't see the build.sbt setting:

addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full)

Besides that, it is mentioned here that Toolbox is not intended for such complex things.

So, I decided to apply an approach using the package scala.tools.nsc as described in this answer. But it throws the same error about the lack of macro annotations.

Thus the main question: is there any chance to add required compiler plugin settings to scala.tools.nsc.{Global, Settings} or to apply any other approach to generate such annotated classes dynamically?

def compileCode(sources: List[String], classpathDirectories: List[AbstractFile], outputDirectory: AbstractFile): Unit = {
  val settings = new Settings
  classpathDirectories.foreach(dir => settings.classpath.prepend(dir.toString))
  settings.outputDirs.setSingleOutput(outputDirectory)
  settings.usejavacp.value = true
  //*****
  // Add macros paradise compiler plugin?
  //*****
  val global = new Global(settings)
  val files = sources.zipWithIndex.map { case (code, i) => new BatchSourceFile(s"(inline-$i)", code) }
  (new global.Run).compileSources(files)
}
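The embedded compiler accepts the same flags as scalac, so one possibility is to hand it -Xplugin pointing at the paradise jar before constructing Global. This is a sketch, not verified against a Bazel sandbox, and the jar path is hypothetical (resolve it from your local Coursier/Ivy cache or bundle it with the application):

```scala
// Sketch: pass the macro-paradise jar to the embedded compiler via -Xplugin.
// The jar path below is hypothetical; locate the real one in your Coursier
// or Ivy cache, or ship it as an application resource.
val settings = new Settings
settings.outputDirs.setSingleOutput(outputDirectory)
settings.usejavacp.value = true

val paradiseJar = "/path/to/paradise_2.12.17-2.1.1.jar" // hypothetical path
settings.processArguments(List(s"-Xplugin:$paradiseJar"), processAll = true)

val global = new Global(settings)
```

If the plugin loads, the "enable macro paradise" error should disappear, though the generated classes still need to end up on the classpath the SCIO job is compiled against.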




dimanche 5 novembre 2023

Why can't I use a lambda expression when calling Constructor.newInstance()?

I am trying to instantiate an object using Constructor.newInstance. The constructor requires one argument of type DoubleSupplier. This is successful if I first create a DoubleSupplier object, then pass the object to the newInstance method:

DoubleSupplier  supplier    = () -> 3.0;
obj = ctor.newInstance( supplier );

But if I try using the lambda directly in the newInstance invocation:

obj = ctor.newInstance( () -> 3.0 );

it fails to compile with "The target type of this expression must be a functional interface." What is the difference between the two methods?

Incidentally, I can use the lambda when instantiating an object using "new".

obj2 = new SubTypeA( () -> 3.0 );

Sample program follows.

import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;
import java.util.function.DoubleSupplier;

public class CtorDemo
{
    public static void main(String[] args)
    {
        SuperType   obj = getSubType( SubTypeA.class );
        System.out.println( obj.getClass().getName() );
    }
    
    private static SuperType 
    getSubType( Class<? extends SuperType> clazz )
    {
        SuperType   obj = null;
        try
        {
            Constructor<? extends SuperType> ctor    =
                clazz.getConstructor( DoubleSupplier.class );
            
            DoubleSupplier  supplier    = () -> 3.0;
            obj = ctor.newInstance( supplier );
            // obj = ctor.newInstance( () -> 3.0 );
            // compile error
        }
        catch ( NoSuchMethodException
                | InvocationTargetException 
                | IllegalAccessException 
                | InstantiationException exc )
        {
            exc.printStackTrace();
            System.exit( 1 );
        }
        return obj;
    }

    private static class SuperType
    {
    }
    private static class SubTypeA extends SuperType
    {
        public SubTypeA( DoubleSupplier supplier )
        {
        }
    }
}
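The error is purely a compile-time typing issue: newInstance is declared as newInstance(Object... initargs), and Object is not a functional interface, so a bare lambda there has no target type. A cast supplies the target type. A minimal sketch (the Holder class is a hypothetical stand-in for SubTypeA):

```java
import java.lang.reflect.Constructor;
import java.util.function.DoubleSupplier;

public class LambdaTarget
{
    // Hypothetical class standing in for SubTypeA.
    public static class Holder
    {
        final double value;
        public Holder( DoubleSupplier supplier )
        {
            value = supplier.getAsDouble();
        }
    }

    public static void main(String[] args) throws Exception
    {
        Constructor<Holder> ctor =
            Holder.class.getConstructor( DoubleSupplier.class );
        // The cast gives the lambda its functional-interface target type;
        // without it the lambda is matched against Object... and rejected.
        Holder obj = ctor.newInstance( (DoubleSupplier) () -> 3.0 );
        System.out.println( obj.value );  // prints 3.0
    }
}
```

With `new SubTypeA(...)` the constructor parameter type DoubleSupplier is known statically, which is why the bare lambda compiles there.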




samedi 4 novembre 2023

How can I skip fields of certain types during serialization?

I have a Spring AOP service which intercepts a lot of different third-party controller endpoints. My service runs an endpoint, gets the result DTO, serializes the DTO to JSON with ObjectMapper, sends the JSON to Kafka, and returns the DTO to the caller. The problem is: the service shouldn't serialize fields of particular types (e.g. MultipartFile). I can't use the @JsonIgnore annotation, because the DTO should be returned outside without changes. Can I make ObjectMapper skip fields of certain types? Or, maybe, copy the DTO and set these fields to null with reflection?
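Jackson has an extension point for exactly this: a custom AnnotationIntrospector can report every property of a given raw type as if it carried @JsonIgnore, and it is registered only on the ObjectMapper used for Kafka, so the DTO itself is untouched. A sketch, with a hypothetical Attachment class standing in for MultipartFile:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.introspect.AnnotatedMember;
import com.fasterxml.jackson.databind.introspect.JacksonAnnotationIntrospector;

import java.util.Set;

public class TypeSkippingMapper
{
    public static class Attachment { }  // hypothetical stand-in for MultipartFile

    // Types whose properties should never reach the Kafka payload.
    private static final Set<Class<?>> SKIPPED = Set.of( Attachment.class );

    public static ObjectMapper build()
    {
        ObjectMapper mapper = new ObjectMapper();
        mapper.setAnnotationIntrospector( new JacksonAnnotationIntrospector() {
            @Override
            public boolean hasIgnoreMarker( AnnotatedMember m )
            {
                // Treat any property of a skipped type as if it had @JsonIgnore.
                return SKIPPED.contains( m.getRawType() ) || super.hasIgnoreMarker( m );
            }
        } );
        return mapper;
    }
}
```

Because only this dedicated mapper carries the introspector, the DTO returned to the caller still contains the MultipartFile field; no reflective copying is needed.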





vendredi 3 novembre 2023

From a 700MB json file, how do I list all the keys in Powershell?

I tried

$obj = [System.IO.File]::ReadLines((Convert-Path -LiteralPath names.json)) | ConvertFrom-Json
 
$keys = @() 
 
foreach ($key in $obj.GetEnumerator()) { 
  $keys += $key.Key 
} 
 
Write-Output $keys

But after over 24 hours it had not completed.

I need the key names so I can

  1. Delete irrelevant info and make it smaller
  2. Convert it to csv (the key names are required; otherwise PS just uses the first object and ignores keys that are not present in the first object)

The JSON is a version of this one (though 200 megs smaller): https://kaikki.org/dictionary/All%20languages%20combined/by-pos-name/kaikki_dot_org-dictionary-all-by-pos-name.json
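The kaikki.org exports appear to be JSON Lines (one complete object per line), so the file never needs to be loaded whole. A sketch, assuming that per-line format, which streams the file and collects distinct top-level key names into a HashSet instead of the quadratic `$keys += ...` pattern:

```powershell
# Assumes JSON Lines: each non-empty line is one self-contained JSON object.
$keys = [System.Collections.Generic.HashSet[string]]::new()
foreach ($line in [System.IO.File]::ReadLines((Convert-Path -LiteralPath names.json))) {
    if ($line.Trim()) {
        # Parse one object at a time; keep only its property names.
        foreach ($prop in ($line | ConvertFrom-Json).PSObject.Properties) {
            [void]$keys.Add($prop.Name)
        }
    }
}
$keys
```

This reports only top-level keys; enumerating nested keys would need a recursive walk over each object's properties.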





jeudi 2 novembre 2023

DotNet: Is there a newer, better, better maintained alternative to ReactiveX?

Reactive Extensions has saved me lots of hours (even months) of coding time, but it seems to be falling out of focus. So I am looking for an alternative. I don't need fancy cross-language communication, nor do I need JS support or whatnot. Just a simple, easy-to-use abstraction of IObserver/IObservable that uses attributes.

As always, thanks tons!





mercredi 1 novembre 2023

Getting all exported type info using reflection for a Go package

Project structure is like below


root
├── package1
│   └── executor struct
└── package2
    └── model struct

Inside package1 I want to programmatically read all exported types of package2. I believe this is doable using reflection, but I'm facing an issue finding the import path of package2.

Tried using the importer package (https://pkg.go.dev/go/importer#pkg-functions):

pkg, err := importer.Default().Import("model")

but this does not work.

Tried using PkgPath

pkgPath := reflect.TypeOf(model.Policy{}).PkgPath()
cfg := &packages.Config{
    Mode:  packages.NeedTypes | packages.NeedTypesInfo,
    Tests: false,
    Env:   append(os.Environ(), "GO111MODULE=off", "USE_SYSTEM_GO=1"),
}
pkgs, err := packages.Load(cfg, pkgPath)

This also does not work.

This is a Bazel-based build system.